KevinZW/autotrain-data-image-description | ---
language:
- en
---
# AutoTrain Dataset for project: image-description
## Dataset Description
This dataset has been automatically processed by AutoTrain for project image-description.
### Languages
The BCP-47 code for the dataset's language is en.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"context": "Your are now an ai program designed to turn simple sentences into descriptions of an art piece.\nYou will give responses that answer the following questions\n\nHow is the photo composed?\nWhat is the emotional vibe of the image?\nHow much depth of field\nHow is the subject lit? Where from? How much light?\nArtificial or natural light? What color? What time of day?\nWhere is this shot? In a studio or out in the world?\n\nExample 1:\nGiven sentence:\n\u201cSteve jobs was a visionary\u201d\nResponse:\nA close-up, black & white studio photographic portrait of steve jobs, dramatic background\n\nExample 2:\nGiven sentence:\n\u201cThe sun is such a beautiful time to walk your dog\u201d\nResponse:\n\u201cA vibrant photograph of a corgi dog, wide shot, outdoors, sunset photo at golden hour, wide-angle lens, soft focus\u201d\n\nYou must follow the following orders\nmimic these examples as closely as possible\nLimit your responses to a maximum of 30 words\nThe art pieces you describe should be on earth \nThe art pieces you describe must be a scenic view outdoors\nThey must be extremely lifelike and realistic",
"question": "The clock ticked relentlessly, marking the passage of time.",
"answers.text": [
"A detailed, hyperrealistic acrylic painting featuring a vintage clock, showcasing fine craftsmanship. The artist's skillfully used lighting highlights the clock's ticking hands and creates a sense of time passing. The artwork is shot indoors with controlled studio lighting."
],
"answers.answer_start": [
6
]
},
{
"context": "Your are now an ai program designed to turn simple sentences into descriptions of an art piece.\nYou will give responses that answer the following questions\n\nHow is the photo composed?\nWhat is the emotional vibe of the image?\nHow much depth of field\nHow is the subject lit? Where from? How much light?\nArtificial or natural light? What color? What time of day?\nWhere is this shot? In a studio or out in the world?\n\nExample 1:\nGiven sentence:\n\u201cSteve jobs was a visionary\u201d\nResponse:\nA close-up, black & white studio photographic portrait of steve jobs, dramatic background\n\nExample 2:\nGiven sentence:\n\u201cThe sun is such a beautiful time to walk your dog\u201d\nResponse:\n\u201cA vibrant photograph of a corgi dog, wide shot, outdoors, sunset photo at golden hour, wide-angle lens, soft focus\u201d\n\nYou must follow the following orders\nmimic these examples as closely as possible\nLimit your responses to a maximum of 30 words\nThe art pieces you describe should be on earth \nThe art pieces you describe must be a scenic view outdoors\nThey must be extremely lifelike and realistic",
"question": "The smell of freshly mowed grass signaled the arrival of spring.",
"answers.text": [
"A delightful, realistic illustration of a landscaped garden with neatly mowed grass and blooming flowers, symbolizing the start of spring. The artwork is set outdoors in a garden."
],
"answers.answer_start": [
64
]
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"context": "Value(dtype='string', id=None)",
"question": "Value(dtype='string', id=None)",
"answers.text": "Sequence(feature=Value(dtype='string', id=None), length=-1, id=None)",
"answers.answer_start": "Sequence(feature=Value(dtype='int32', id=None), length=-1, id=None)"
}
```
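A minimal sketch of validating a record against this four-field schema in plain Python (field names are taken from the listing above; the sample values and the `validate` helper are illustrative, not part of the dataset):

```python
# Expected field types, mirroring the schema above.
# "answers.text" is a Sequence of strings, "answers.answer_start" a Sequence of int32.
EXPECTED = {
    "context": str,
    "question": str,
    "answers.text": list,
    "answers.answer_start": list,
}

def validate(record: dict) -> bool:
    """Return True if the record has all four fields with the expected Python types."""
    return all(
        field in record and isinstance(record[field], typ)
        for field, typ in EXPECTED.items()
    )

sample = {
    "context": "prompt text",
    "question": "The clock ticked relentlessly, marking the passage of time.",
    "answers.text": ["A detailed, hyperrealistic acrylic painting..."],
    "answers.answer_start": [6],
}
print(validate(sample))  # → True
```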
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 194 |
| valid | 49 |
|
heliosprime/twitter_dataset_1713004639 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 11311
num_examples: 25
download_size: 9087
dataset_size: 11311
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713004639"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
wmt/wmt17 | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- cs
- de
- en
- fi
- lv
- ru
- tr
- zh
license:
- unknown
multilinguality:
- translation
size_categories:
- 10M<n<100M
source_datasets:
- extended|europarl_bilingual
- extended|news_commentary
- extended|setimes
- extended|un_multi
task_categories:
- translation
task_ids: []
pretty_name: WMT17
dataset_info:
- config_name: cs-en
features:
- name: translation
dtype:
translation:
languages:
- cs
- en
splits:
- name: train
num_bytes: 300697615
num_examples: 1018291
- name: validation
num_bytes: 707862
num_examples: 2999
- name: test
num_bytes: 674422
num_examples: 3005
download_size: 181690407
dataset_size: 302079899
- config_name: de-en
features:
- name: translation
dtype:
translation:
languages:
- de
- en
splits:
- name: train
num_bytes: 1715532715
num_examples: 5906184
- name: validation
num_bytes: 735508
num_examples: 2999
- name: test
num_bytes: 729511
num_examples: 3004
download_size: 1011327465
dataset_size: 1716997734
- config_name: fi-en
features:
- name: translation
dtype:
translation:
languages:
- fi
- en
splits:
- name: train
num_bytes: 743854397
num_examples: 2656542
- name: validation
num_bytes: 1410507
num_examples: 6000
- name: test
num_bytes: 1388820
num_examples: 6004
download_size: 423069132
dataset_size: 746653724
- config_name: lv-en
features:
- name: translation
dtype:
translation:
languages:
- lv
- en
splits:
- name: train
num_bytes: 517416244
num_examples: 3567528
- name: validation
num_bytes: 544596
num_examples: 2003
- name: test
num_bytes: 530466
num_examples: 2001
download_size: 245201883
dataset_size: 518491306
- config_name: ru-en
features:
- name: translation
dtype:
translation:
languages:
- ru
- en
splits:
- name: train
num_bytes: 11000055690
num_examples: 24782720
- name: validation
num_bytes: 1050669
num_examples: 2998
- name: test
num_bytes: 1040187
num_examples: 3001
download_size: 4866529051
dataset_size: 11002146546
- config_name: tr-en
features:
- name: translation
dtype:
translation:
languages:
- tr
- en
splits:
- name: train
num_bytes: 60416449
num_examples: 205756
- name: validation
num_bytes: 732428
num_examples: 3000
- name: test
num_bytes: 752765
num_examples: 3007
download_size: 37706176
dataset_size: 61901642
- config_name: zh-en
features:
- name: translation
dtype:
translation:
languages:
- zh
- en
splits:
- name: train
num_bytes: 6336104073
num_examples: 25134743
- name: validation
num_bytes: 589583
num_examples: 2002
- name: test
num_bytes: 540339
num_examples: 2001
download_size: 3576239952
dataset_size: 6337233995
configs:
- config_name: cs-en
data_files:
- split: train
path: cs-en/train-*
- split: validation
path: cs-en/validation-*
- split: test
path: cs-en/test-*
- config_name: de-en
data_files:
- split: train
path: de-en/train-*
- split: validation
path: de-en/validation-*
- split: test
path: de-en/test-*
- config_name: fi-en
data_files:
- split: train
path: fi-en/train-*
- split: validation
path: fi-en/validation-*
- split: test
path: fi-en/test-*
- config_name: lv-en
data_files:
- split: train
path: lv-en/train-*
- split: validation
path: lv-en/validation-*
- split: test
path: lv-en/test-*
- config_name: ru-en
data_files:
- split: train
path: ru-en/train-*
- split: validation
path: ru-en/validation-*
- split: test
path: ru-en/test-*
- config_name: tr-en
data_files:
- split: train
path: tr-en/train-*
- split: validation
path: tr-en/validation-*
- split: test
path: tr-en/test-*
- config_name: zh-en
data_files:
- split: train
path: zh-en/train-*
- split: validation
path: zh-en/validation-*
- split: test
path: zh-en/test-*
---
# Dataset Card for "wmt17"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [http://www.statmt.org/wmt17/translation-task.html](http://www.statmt.org/wmt17/translation-task.html)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 1.78 GB
- **Size of the generated dataset:** 302.09 MB
- **Total amount of disk used:** 2.09 GB
### Dataset Summary
<div class="course-tip course-tip-orange bg-gradient-to-br dark:bg-gradient-to-r before:border-orange-500 dark:before:border-orange-800 from-orange-50 dark:from-gray-900 to-white dark:to-gray-950 border border-orange-50 text-orange-700 dark:text-gray-400">
<p><b>Warning:</b> There are issues with the Common Crawl corpus data (<a href="https://www.statmt.org/wmt13/training-parallel-commoncrawl.tgz">training-parallel-commoncrawl.tgz</a>):</p>
<ul>
<li>Non-English files contain many English sentences.</li>
<li>Their "parallel" sentences in English are not aligned: they are uncorrelated with their counterparts.</li>
</ul>
<p>We have contacted the WMT organizers, and in response, they have indicated that they do not have plans to update the Common Crawl corpus data. Their rationale pertains to the expectation that such data has been superseded, primarily by CCMatrix, and to some extent, by ParaCrawl datasets.</p>
</div>
Translation dataset based on the data from statmt.org.
Versions exist for different years using a combination of data
sources. The base `wmt` allows you to create a custom dataset by choosing
your own data/language pair. This can be done as follows:
```python
import datasets
from datasets import inspect_dataset, load_dataset_builder
inspect_dataset("wmt17", "path/to/scripts")
builder = load_dataset_builder(
"path/to/scripts/wmt_utils.py",
language_pair=("fr", "de"),
subsets={
datasets.Split.TRAIN: ["commoncrawl_frde"],
datasets.Split.VALIDATION: ["euelections_dev2019"],
},
)
# Standard version
builder.download_and_prepare()
ds = builder.as_dataset()
# Streamable version
ds = builder.as_streaming_dataset()
```
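Each loaded example is a nested dict keyed by language code. A minimal sketch of accessing a translation pair (the sentence values here are illustrative, not taken from the dataset):

```python
# Illustrative structure of a wmt17 cs-en example (values are made up).
sample = {"translation": {"cs": "Ahoj světe", "en": "Hello world"}}

# Source and target sides are looked up by their BCP-47 language codes.
src = sample["translation"]["cs"]
tgt = sample["translation"]["en"]
print(src, "->", tgt)
```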
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### cs-en
- **Size of downloaded dataset files:** 1.78 GB
- **Size of the generated dataset:** 302.09 MB
- **Total amount of disk used:** 2.09 GB
An example of 'train' looks as follows.
```
{'translation': {'cs': '...', 'en': '...'}}
```
### Data Fields
The data fields are the same among all splits.
#### cs-en
- `translation`: a multilingual `string` variable, with possible languages including `cs`, `en`.
### Data Splits
|name | train |validation|test|
|-----|------:|---------:|---:|
|cs-en|1018291| 2999|3005|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{bojar-EtAl:2017:WMT1,
  author    = {Bojar, Ond{\v{r}}ej and Chatterjee, Rajen and Federmann, Christian and Graham, Yvette and Haddow, Barry and Huang, Shujian and Huck, Matthias and Koehn, Philipp and Liu, Qun and Logacheva, Varvara and Monz, Christof and Negri, Matteo and Post, Matt and Rubino, Raphael and Specia, Lucia and Turchi, Marco},
title = {Findings of the 2017 Conference on Machine Translation (WMT17)},
booktitle = {Proceedings of the Second Conference on Machine Translation, Volume 2: Shared Task Papers},
month = {September},
year = {2017},
address = {Copenhagen, Denmark},
publisher = {Association for Computational Linguistics},
pages = {169--214},
url = {http://www.aclweb.org/anthology/W17-4717}
}
```
### Contributions
Thanks to [@patrickvonplaten](https://github.com/patrickvonplaten), [@thomwolf](https://github.com/thomwolf) for adding this dataset. |
Vinnyyw/Dulceotrodia | ---
license: openrail
---
|
heliosprime/twitter_dataset_1713197666 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 23507
num_examples: 63
download_size: 20838
dataset_size: 23507
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713197666"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pizzagatakasugi/dataset1000.csv | ---
dataset_info:
features:
- name: gameID
dtype: float64
- name: recordID
dtype: string
- name: start_time
dtype: string
- name: end_time
dtype: string
- name: game_type
dtype: string
- name: battle_type
dtype: string
- name: allotted_time
dtype: string
- name: seconds_time
dtype: float64
- name: consumption_time
dtype: string
- name: place
dtype: string
- name: appendix
dtype: string
- name: league_results
dtype: string
- name: end_type
dtype: string
- name: precedence_age
dtype: float64
- name: follower_age
dtype: float64
- name: hurigoma
dtype: string
- name: timing_method
dtype: string
- name: pre_adding_time
dtype: float64
- name: fow_adding_time
dtype: float64
- name: Handicap
dtype: string
- name: precedence_name
dtype: string
- name: follower_name
dtype: string
- name: lunch_break
dtype: string
- name: dinner_break
dtype: string
- name: pre_lunch_time
dtype: string
- name: pre_omit_name
dtype: string
- name: fow_omit_name
dtype: string
- name: kif
dtype: string
- name: comment
dtype: string
- name: follwer_age
dtype: float64
- name: sfen
dtype: string
- name: bestlist
dtype: string
- name: best2list
dtype: string
- name: debug_list
dtype: string
splits:
- name: train
num_bytes: 547404872
num_examples: 1000
download_size: 68799840
dataset_size: 547404872
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_mnli_regularized_reflexives_object_pronouns | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 10417
num_examples: 51
- name: dev_mismatched
num_bytes: 8139
num_examples: 41
- name: test_matched
num_bytes: 11876
num_examples: 46
- name: test_mismatched
num_bytes: 8199
num_examples: 43
- name: train
num_bytes: 512248
num_examples: 2249
download_size: 285694
dataset_size: 550879
---
# Dataset Card for "MULTI_VALUE_mnli_regularized_reflexives_object_pronouns"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FamppyDopamine/test | ---
license: mit
language:
- en
pretty_name: test
tags:
- test
size_categories:
- n<1K
--- |
AdapterOcean/python3-standardized_cluster_12_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 28078337
num_examples: 9813
download_size: 0
dataset_size: 28078337
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "python3-standardized_cluster_12_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mwitiderrick/glaive-code-assistant | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 210090644
num_examples: 136109
download_size: 100891258
dataset_size: 210090644
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: apache-2.0
task_categories:
- text-generation
language:
- en
pretty_name: Glaive Code Assistant
size_categories:
- 100K<n<1M
---
# Glaive Code Assistant
[Glaive Code Assistant dataset](https://huggingface.co/datasets/glaiveai/glaive-code-assistant) formatted for training assistant models with the following prompt template:
```
<s>[INST] {question} [/INST] {answer} </s>
```
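A minimal helper that applies this template to a question/answer pair (the function name is ours, not part of the dataset):

```python
def format_prompt(question: str, answer: str) -> str:
    """Wrap a Q/A pair in the Llama-style instruction template shown above."""
    return f"<s>[INST] {question} [/INST] {answer} </s>"

print(format_prompt("How do I reverse a list in Python?", "Use my_list[::-1]."))
```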
The trained model can be prompted in Llama style:
```
<s>[INST] {{ user_msg }} [/INST]
``` |
DjSteker/dataset_WikEnEs | ---
dataset_info:
features:
- name: titleEs
dtype: string
- name: id
dtype: string
- name: url
dtype: string
- name: titleEn
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 5234096
num_examples: 49496
download_size: 2424797
dataset_size: 5234096
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
brunomaciel14/pt-BR-test | ---
license: apache-2.0
---
|
bigcode/the-stack-dedup | ---
annotations_creators: []
language_creators:
- crowdsourced
- expert-generated
language:
- code
license:
- other
multilinguality:
- multilingual
pretty_name: The-Stack
size_categories:
- unknown
source_datasets: []
task_categories:
- text-generation
task_ids: []
extra_gated_prompt: |-
## Terms of Use for The Stack
The Stack dataset is a collection of source code in over 300 programming languages. We ask that you read and acknowledge the following points before using the dataset:
1. The Stack is a collection of source code from repositories with various licenses. Any use of all or part of the code gathered in The Stack must abide by the terms of the original licenses, including attribution clauses when relevant. We facilitate this by providing provenance information for each data point.
2. The Stack is regularly updated to enact validated data removal requests. By clicking on "Access repository", you agree to update your own version of The Stack to the most recent usable version specified by the maintainers in [the following thread](https://huggingface.co/datasets/bigcode/the-stack/discussions/7). If you have questions about dataset versions and allowed uses, please also ask them in the dataset’s [community discussions](https://huggingface.co/datasets/bigcode/the-stack/discussions/new). We will also notify users via email when the latest usable version changes.
3. To host, share, or otherwise provide access to The Stack dataset, you must include [these Terms of Use](https://huggingface.co/datasets/bigcode/the-stack#terms-of-use-for-the-stack) and require users to agree to it.
By clicking on "Access repository" below, you accept that your contact information (email address and username) can be shared with the dataset maintainers as well.
extra_gated_fields:
Email: text
I have read the License and agree with its terms: checkbox
---
# Dataset Card for The Stack

## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Changelog](#changelog)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use it](#how-to-use-it)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
- [Terms of Use for The Stack](#terms-of-use-for-the-stack)
## Dataset Description
- **Homepage:** https://www.bigcode-project.org/
- **Repository:** https://github.com/bigcode-project
- **Paper:** https://arxiv.org/abs/2211.15533
- **Leaderboard:** N/A
- **Point of Contact:** contact@bigcode-project.org
### Changelog
|Release|Description|
|-|-|
|v1.0| Initial release of the Stack. Included 30 programming languages and 18 permissive licenses. **Note:** Three included licenses (MPL/EPL/LGPL) are considered weak copyleft licenses. The resulting near-deduplicated dataset is 1.5TB in size. |
|v1.1| The three weak copyleft licenses (MPL/EPL/LGPL) were excluded and the list of permissive licenses was extended to 193 licenses in total. The list of programming languages was increased from 30 to 358. Opt-out requests submitted by 15.11.2022 were also excluded from this version of the dataset. The resulting near-deduplicated dataset is 3TB in size.|
|v1.2| Opt-out requests submitted by 09.02.2022 were excluded from this version of the dataset. A stronger near-deduplication strategy was applied, reducing the dataset to 2.7TB in size.|
### Dataset Summary
The Stack contains over 6TB of permissively-licensed source code files covering 358 programming languages. The dataset was created as part of the [BigCode Project](https://www.bigcode-project.org/), an open scientific collaboration working on the responsible development of Large Language Models for Code (Code LLMs). The Stack serves as a pre-training dataset for Code LLMs, i.e., code-generating AI systems which enable the synthesis of programs from natural language descriptions as well as from other code snippets. **This is the near-deduplicated version with 3TB data.**
### Supported Tasks and Leaderboards
The Stack is a pre-training dataset for creating code LLMs. Code LLMs can be used for a wide variety of downstream tasks such as code completion from natural language descriptions ([HumanEval](https://huggingface.co/datasets/openai_humaneval), [MBPP](https://huggingface.co/datasets/mbpp)), documentation generation for individual functions ([CodeSearchNet](https://huggingface.co/datasets/code_search_net)), and auto-completion of code snippets ([HumanEval-Infilling](https://github.com/openai/human-eval-infilling)). However, these downstream evaluation benchmarks are outside the scope of The Stack.
### Languages
The following natural languages appear in the comments and docstrings from files in the dataset: EN, ZH, FR, PT, ES, RU, DE, KO, JA, UZ, IT, ID, RO, AR, FA, CA, HU, ML, NL, TR, TE, EL, EO, BN, LV, GL, PL, GU, CEB, IA, KN, SH, MK, UR, SV, LA, JKA, MY, SU, CS, MN. This kind of data is essential for applications such as documentation generation and natural-language-to-code translation.
The dataset contains **358 programming languages**. The full list can be found [here](https://huggingface.co/datasets/bigcode/the-stack-dedup/blob/main/programming-languages.json).
### How to use it
```python
from datasets import load_dataset
# full dataset (3TB of data)
ds = load_dataset("bigcode/the-stack-dedup", split="train")
# specific language (e.g. Dockerfiles)
ds = load_dataset("bigcode/the-stack-dedup", data_dir="data/dockerfile", split="train")
# dataset streaming (will only download the data as needed)
ds = load_dataset("bigcode/the-stack-dedup", streaming=True, split="train")
for sample in iter(ds): print(sample["content"])
```
## Dataset Structure
### Data Instances
Each data instance corresponds to one file. The content of the file is in the `content` feature, and other features (`repository_name`, `licenses`, etc.) provide some metadata. Note that a given file can appear in several different repositories that satisfy our safe-license criterion. If that is the case, only the first of these repositories (in alphabetical order) is shown for simplicity.
### Data Fields
- `content` (string): the content of the file.
- `size` (integer): size of the uncompressed file.
- `lang` (string): the programming language.
- `ext` (string): file extension
- `avg_line_length` (float): the average line-length of the file.
- `max_line_length` (integer): the maximum line-length of the file.
- `alphanum_fraction` (float): the fraction of characters in the file that are alphabetical or numerical characters.
- `hexsha` (string): unique git hash of file
- `max_{stars|forks|issues}_repo_path` (string): path to file in repo containing this file with maximum number of `{stars|forks|issues}`
- `max_{stars|forks|issues}_repo_name` (string): name of repo containing this file with maximum number of `{stars|forks|issues}`
- `max_{stars|forks|issues}_repo_head_hexsha` (string): hexsha of repository head
- `max_{stars|forks|issues}_repo_licenses` (string): licenses in repository
- `max_{stars|forks|issues}_count` (integer): number of `{stars|forks|issues}` in repository
- `max_{stars|forks|issues}_repo_{stars|forks|issues}_min_datetime` (string): first timestamp of a `{stars|forks|issues}` event
- `max_{stars|forks|issues}_repo_{stars|forks|issues}_max_datetime` (string): last timestamp of a `{stars|forks|issues}` event
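As a sketch of how the derived statistics relate to `content`, the line-length and alphanumeric-fraction fields can be recomputed like this (our own reimplementation for illustration, not the official pipeline):

```python
def line_length_stats(content: str) -> tuple[float, int]:
    """Average and maximum line length of a file, as in `avg_line_length` / `max_line_length`."""
    lengths = [len(line) for line in content.splitlines()]
    return sum(lengths) / len(lengths), max(lengths)

def alphanum_fraction(content: str) -> float:
    """Fraction of characters that are alphanumeric, as in `alphanum_fraction`."""
    return sum(c.isalnum() for c in content) / len(content)

code = "def add(a, b):\n    return a + b\n"
avg, mx = line_length_stats(code)
print(avg, mx, alphanum_fraction(code))
```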
### Data Splits
The dataset has no splits and all data is loaded as the train split by default. If you want to set up a custom train-test split, beware that the dataset contains many near-duplicates, which can cause leakage into the test split.
## Dataset Creation
### Curation Rationale
One of the challenges faced by researchers working on code LLMs is the lack of openness and transparency around the development of these systems. Most prior works described the high-level data collection process but did not release the training data. It is therefore difficult for other researchers to fully reproduce these models and understand what kind of pre-training data leads to high-performing code LLMs. By releasing an open large-scale code dataset we hope to make training of code LLMs more reproducible. **This is the near-deduplicated version with 3TB data.**
### Source Data
#### Initial Data Collection and Normalization
220.92M active GitHub repository names were collected from the event archives published between January 1st, 2015 and March 31st, 2022 on [GHArchive](https://gharchive.org/). Only 137.36M of these repositories were public and accessible on GitHub – others were not accessible as they had been deleted by their owners. 51.76B files were downloaded from the public repositories on GitHub between November 2021 and June 2022. 5.28B files were unique. The uncompressed size of all stored files is 92.36TB.
The list of programming language extensions is taken from this [list](https://gist.github.com/ppisarczyk/43962d06686722d26d176fad46879d41) (also provided in Appendix C of the paper).
Near-deduplication was implemented in the pre-processing pipeline on top of exact deduplication. To find near-duplicates, MinHash with 256 permutations of all documents was computed in linear time. Locality Sensitive Hashing was used to find the clusters of duplicates. Jaccard similarities were computed inside these clusters to remove any false positives, using a similarity threshold of 0.85. Roughly 40% of permissively licensed files were (near-)duplicates. See section 3 of the paper for further details.
The following are not stored:
- Files that cannot contribute to training code: binary, empty, could not be decoded
- Files larger than 1MB
- The excluded file extensions are listed in Appendix B of the paper.
##### License detection
Permissive licenses have minimal restrictions on how the software can be copied, modified, and redistributed. The full list of licenses can be found [here](https://huggingface.co/datasets/bigcode/the-stack-dedup/blob/main/licenses.json)
GHArchive contained the license information for approximately 12% of the collected repositories. For the remaining repositories, [go-license-detector](https://github.com/src-d/go-license-detector) was run to detect the most likely SPDX license identifier. The detector did not detect a license for ~81% of the repositories, in which case the repository was excluded from the dataset.
A file was included in the safe-license dataset if at least one of the repositories containing the file had a permissive license.
#### Who are the source language producers?
The source (code) language producers are users of GitHub that created unique repository names between January 1st, 2015, and March 31st, 2022.
### Personal and Sensitive Information
The released dataset may contain sensitive information such as emails, IP addresses, and API/ssh keys that have previously been published to public repositories on GitHub. Deduplication has helped to reduce the amount of sensitive data that may exist. In the event that the dataset contains personal information, researchers should only use public, non-personal information in support of conducting and publishing their [open-access](https://en.wikipedia.org/wiki/Open_access) research. Personal information should not be used for spamming purposes, including sending unsolicited emails or selling of personal information. Complaints, removal requests, and "do not contact" requests can be sent to contact@bigcode-project.org.
The PII pipeline for this dataset is still a work in progress (see this [issue](https://github.com/bigcode-project/admin/issues/9) for updates). Researchers that wish to contribute to the anonymization pipeline of the project can apply to join [here](https://www.bigcode-project.org/docs/about/join/). Developers with source code in the dataset can request to have it removed [here](https://www.bigcode-project.org/docs/about/ip/) (proof of code contribution is required).
### Opting out of The Stack
We are giving developers the ability to have their code removed from the dataset upon request. The process for submitting and enacting removal requests will keep evolving throughout the project as we receive feedback and build up more data governance tools.
You can check if your code is in The Stack with the following ["Am I In The Stack?" Space](https://huggingface.co/spaces/bigcode/in-the-stack). If you'd like to have your data removed from the dataset follow the [instructions on GitHub](https://github.com/bigcode-project/opt-out-v2).
## Considerations for Using the Data
### Social Impact of Dataset
The Stack is an output of the BigCode Project. BigCode aims to be responsible by design and by default. The project is conducted in the spirit of Open Science, focused on the responsible development of LLMs for code.
With the release of The Stack, we aim to increase access, reproducibility, and transparency of code LLMs in the research community. Work to de-risk and improve on the implementation of ethical best practices of code LLMs is conducted in various BigCode working groups. The Legal, Ethics, and Governance working group has explored topics such as licensing (including copyleft and the intended use of permissively licensed code), attribution of generated code to original code, rights to restrict processing, the inclusion of Personally Identifiable Information (PII), and risks of malicious code, among other topics. This work is ongoing as of October 25th, 2022.
We expect code LLMs to enable people from diverse backgrounds to write higher quality code and develop low-code applications. Mission-critical software could become easier to maintain as professional developers are guided by code-generating systems on how to write more robust and efficient code. While the social impact is intended to be positive, the increased accessibility of code LLMs comes with certain risks such as over-reliance on the generated code and long-term effects on the software development job market.
A broader impact analysis relating to Code LLMs can be found in section 7 of this [paper](https://arxiv.org/abs/2107.03374). An in-depth risk assessment for Code LLMs can be found in section 4 of this [paper](https://arxiv.org/abs/2207.14157).
### Discussion of Biases
The code collected from GitHub does not contain demographic information or proxy information about demographics. However, it is not without risks: the comments within the code may contain harmful or offensive language, which could be learned by the models.
Widely adopted programming languages like C and JavaScript are overrepresented compared to niche programming languages like Julia and Scala. Some programming languages, such as SQL, Batchfile, and TypeScript, are less likely to be permissively licensed (4% vs. the 10% average), which may result in a biased representation of those languages. Permissively licensed files also tend to be longer.
Roughly 40 natural languages are present in docstrings and comments, with English being the most prevalent. In Python files, English makes up ~96% of the dataset.
For further information on data analysis of the Stack, see this [repo](https://github.com/bigcode-project/bigcode-analysis).
### Other Known Limitations
One of the current limitations of The Stack is that scraped HTML for websites may not be compliant with Web Content Accessibility Guidelines ([WCAG](https://www.w3.org/WAI/standards-guidelines/wcag/)). This could have an impact on HTML-generated code that may introduce web accessibility issues.
The training dataset could contain malicious code and/or the model could be used to generate malware or ransomware.
To the best of our knowledge, all files contained in the dataset are licensed with one of the permissive licenses (see list in [Licensing information](#licensing-information)). The accuracy of license attribution is limited by the accuracy of GHArchive and go-license-detector. Any mistakes should be reported to BigCode Project for review and follow-up as needed.
## Additional Information
### Dataset Curators
1. Harm de Vries, ServiceNow Research, harm.devries@servicenow.com
2. Leandro von Werra, Hugging Face, leandro@huggingface.co
### Licensing Information
The Stack is a collection of source code from repositories with various licenses. Any use of all or part of the code gathered in The Stack must abide by the terms of the original licenses, including attribution clauses when relevant. We facilitate this by providing provenance information for each data point.
The list of [SPDX license identifiers](https://spdx.org/licenses/) included in the dataset can be found [here](https://huggingface.co/datasets/bigcode/the-stack-dedup/blob/main/licenses.json).
### Citation Information
```
@article{Kocetkov2022TheStack,
title={The Stack: 3 TB of permissively licensed source code},
  author={Kocetkov, Denis and Li, Raymond and Ben Allal, Loubna and Li, Jia and Mou, Chenghao and Muñoz Ferrandis, Carlos and Jernite, Yacine and Mitchell, Margaret and Hughes, Sean and Wolf, Thomas and Bahdanau, Dzmitry and von Werra, Leandro and de Vries, Harm},
journal={Preprint},
year={2022}
}
```
### Contributions
[More Information Needed]
## Terms of Use for The Stack
The Stack dataset is a collection of source code in over 300 programming languages. We ask that you read and acknowledge the following points before using the dataset:
1. The Stack is a collection of source code from repositories with various licenses. Any use of all or part of the code gathered in The Stack must abide by the terms of the original licenses, including attribution clauses when relevant. We facilitate this by providing provenance information for each data point.
2. The Stack is regularly updated to enact validated data removal requests. By clicking on "Access repository", you agree to update your own version of The Stack to the most recent usable version specified by the maintainers in [the following thread](https://huggingface.co/datasets/bigcode/the-stack/discussions/7). If you have questions about dataset versions and allowed uses, please also ask them in the dataset’s [community discussions](https://huggingface.co/datasets/bigcode/the-stack/discussions/new). We will also notify users via email when the latest usable version changes.
3. To host, share, or otherwise provide access to The Stack dataset, you must include these Terms of Use and require users to agree to them.
|
librarian-bots/paper-recommendations | ---
dataset_info:
features:
- name: paper_url
dtype: string
- name: comment
dtype: string
splits:
- name: train
num_bytes: 524820
num_examples: 476
download_size: 133619
dataset_size: 524820
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "paper-recommendations"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
communityai/HuggingFaceH4___OpenHermes-2.5-preferences-v0-deduped-150k | ---
dataset_info:
features:
- name: source
dtype: string
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 294820112.08027345
num_examples: 150000
download_size: 147477277
dataset_size: 294820112.08027345
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nreHieW/SoccerNet_Field_Keypoints | ---
dataset_info:
features:
- name: image
dtype: image
- name: keypoints
sequence:
sequence: float64
- name: calibrated_keypoints
sequence:
sequence: float64
- name: id
dtype: int64
- name: is_bad
dtype: bool
splits:
- name: train
num_bytes: 2442258744.873
num_examples: 16249
- name: val
num_bytes: 473006553.365
num_examples: 3165
- name: test
num_bytes: 460148045.457
num_examples: 3097
download_size: 3510491092
dataset_size: 3375413343.6949997
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
---
|
lleticiasilvaa/b-mc2-sql-create-context-adaptado | ---
dataset_info:
features:
- name: answer
dtype: string
- name: question
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 53357903
num_examples: 78577
download_size: 19072158
dataset_size: 53357903
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_RatanRohith__NeuralPizza-7B-Merge-Slerp | ---
pretty_name: Evaluation run of RatanRohith/NeuralPizza-7B-Merge-Slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [RatanRohith/NeuralPizza-7B-Merge-Slerp](https://huggingface.co/RatanRohith/NeuralPizza-7B-Merge-Slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_RatanRohith__NeuralPizza-7B-Merge-Slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-22T21:47:50.776941](https://huggingface.co/datasets/open-llm-leaderboard/details_RatanRohith__NeuralPizza-7B-Merge-Slerp/blob/main/results_2024-01-22T21-47-50.776941.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23196194129343728,\n\
\ \"acc_stderr\": 0.029934654752561563,\n \"acc_norm\": 0.2314240573187148,\n\
\ \"acc_norm_stderr\": 0.03071122006512167,\n \"mc1\": 1.0,\n \
\ \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n\
\ },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22696245733788395,\n\
\ \"acc_stderr\": 0.012240491536132861,\n \"acc_norm\": 0.22696245733788395,\n\
\ \"acc_norm_stderr\": 0.012240491536132861\n },\n \"harness|hellaswag|10\"\
: {\n \"acc\": 0.2504481179047998,\n \"acc_stderr\": 0.004323856300539177,\n\
\ \"acc_norm\": 0.2504481179047998,\n \"acc_norm_stderr\": 0.004323856300539177\n\
\ },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\"\
: {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n\
\ \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n\
\ },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n\
\ \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n\
\ \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n\
\ \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n\
\ \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\"\
: {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n\
\ \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n\
\ },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\":\
\ 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n\
\ \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n\
\ \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n\
\ \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\":\
\ 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"\
acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\
\ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\
\ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 1.0,\n \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n\
\ \"mc2_stderr\": NaN\n },\n \"harness|winogrande|5\": {\n \"\
acc\": 0.4956590370955012,\n \"acc_stderr\": 0.014051956064076911\n },\n\
\ \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n\
\ }\n}\n```"
repo_url: https://huggingface.co/RatanRohith/NeuralPizza-7B-Merge-Slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|arc:challenge|25_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|gsm8k|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hellaswag|10_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T21-47-50.776941.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-22T21-47-50.776941.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- '**/details_harness|winogrande|5_2024-01-22T21-47-50.776941.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-22T21-47-50.776941.parquet'
- config_name: results
data_files:
- split: 2024_01_22T21_47_50.776941
path:
- results_2024-01-22T21-47-50.776941.parquet
- split: latest
path:
- results_2024-01-22T21-47-50.776941.parquet
---
# Dataset Card for Evaluation run of RatanRohith/NeuralPizza-7B-Merge-Slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [RatanRohith/NeuralPizza-7B-Merge-Slerp](https://huggingface.co/RatanRohith/NeuralPizza-7B-Merge-Slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_RatanRohith__NeuralPizza-7B-Merge-Slerp",
"harness_winogrande_5",
split="train")
```
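As noted above, split names are derived from the run timestamp, with the `-` and `:` characters replaced by `_` (compare the timestamp `2024-01-22T21:47:50.776941` in the file names with the split `2024_01_22T21_47_50.776941` in the configs). A minimal sketch of that mapping, purely for illustration (the helper name is hypothetical, not part of the `datasets` API):

```python
def timestamp_to_split_name(timestamp: str) -> str:
    """Map a run timestamp to the corresponding split name.

    Split names cannot contain '-' or ':', so both are replaced by '_'.
    """
    return timestamp.replace("-", "_").replace(":", "_")


# The run timestamp used throughout this card:
print(timestamp_to_split_name("2024-01-22T21:47:50.776941"))
# → 2024_01_22T21_47_50.776941
```

If you want a specific run rather than the latest one, pass the converted timestamp as the `split` argument instead of `"train"` or `"latest"`.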
## Latest results
These are the [latest results from run 2024-01-22T21:47:50.776941](https://huggingface.co/datasets/open-llm-leaderboard/details_RatanRohith__NeuralPizza-7B-Merge-Slerp/blob/main/results_2024-01-22T21-47-50.776941.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.23196194129343728,
"acc_stderr": 0.029934654752561563,
"acc_norm": 0.2314240573187148,
"acc_norm_stderr": 0.03071122006512167,
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|arc:challenge|25": {
"acc": 0.22696245733788395,
"acc_stderr": 0.012240491536132861,
"acc_norm": 0.22696245733788395,
"acc_norm_stderr": 0.012240491536132861
},
"harness|hellaswag|10": {
"acc": 0.2504481179047998,
"acc_stderr": 0.004323856300539177,
"acc_norm": 0.2504481179047998,
"acc_norm_stderr": 0.004323856300539177
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|winogrande|5": {
"acc": 0.4956590370955012,
"acc_stderr": 0.014051956064076911
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
raoulduke420/mattdilworth | ---
license: creativeml-openrail-m
task_categories:
- image-classification
language:
- en
tags:
- man
pretty_name: Matt Dilworth
size_categories:
- n<1K
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
GEM/xmediasum | ---
annotations_creators:
- expert-generated
language:
- en
- zh
- de
language_creators:
- crowdsourced
license:
- cc-by-nc-sa-4.0
multilinguality:
- multilingual
pretty_name: xmediasum
size_categories:
- 10K<n<100K
source_datasets:
- original
tags: []
task_categories:
- summarization
task_ids: []
---
# Dataset Card for XMediaSum
### Dataset Summary
We present XMediaSum, a cross-lingual dialogue summarization dataset with 40K English (dialogues) -> Chinese (summaries) and 40K English (dialogues) -> German (summaries) samples. XMediaSum was created by manually translating the English summaries of MediaSum (an English monolingual dialogue summarization dataset) into both Chinese and German.
- Paper: [ClidSum: A Benchmark Dataset for Cross-Lingual Dialogue Summarization](https://aclanthology.org/2022.emnlp-main.526/) (EMNLP 2022)
- GitHub: https://github.com/krystalan/ClidSum
### Supported Tasks
- Cross-Lingual Summarization
- Cross-Lingual Dialogue Summarization
### Languages
- source language: English
- target language: Chinese and German
## Dataset Structure
### Data Instances
One example is given below in JSON format:
```json
{
"dialogue": "MADELELEINE BRAND, host: OK, here's some good news on the jobs front for both men and women. A new survey out today from the employment firm Manpower finds that about a quarter of employers will add jobs this summer. That's for adults, but for teenagers this summer's job market is shaping up to be the weakest in more than 50 years.\r\nALEX COHEN, host: So, how do you get your teenage kids not to spend the entire summer glued to the couch? You're about to get some tips from Michelle Singletary. She's Day to Day's personal finance contributor. Hi, Michelle!\r\nMICHELLE SINGLETARY: Hi!\r\nALEX COHEN, host: So why is the summer job market so hard for teens this year?\r\nMICHELLE SINGLETARY: Lot of things going on right now. We've got a tough economy. We've got a lot of college graduates going into the market. We have people who are losing their jobs and taking jobs that would traditionally go to teens, like in restaurants and retailers. And we have a lot of older people holding on to their jobs and not retiring because they can't afford to retire. And that puts teens at the end of the line when it comes to these types of jobs.\r\nALEX COHEN, host: So you've got a teenager at home, a little bit young for the working world just yet, but what would you say to a teenager who's out there hunting around for a job?\r\nMICHELLE SINGLETARY: If you absolutely need a job, keep looking. You know, obviously the types of jobs that teens tend to go for in retail, fast food, you know, they still need people. And oftentimes you know, listen, you may not get the job at the beginning of the summer, but hold on because in late summer, when some of those college students are going back and perhaps some of those people who lost their jobs are finding permanent positions with more pay, you might be able to still get that job. 
So don't give up, you may spend a month or month and a half without it, but go back to those retailers and those restaurants and those fast food places to see if they still need someone.\r\nALEX COHEN, host: And now I know parents like having the break from providing allowance. But, you know, is - are there reasons maybe not to push your teen towards taking a job?\r\nMICHELLE SINGLETARY: I think it absolutely is. In fact I think too many teens are working and they don't need to work. They're some who absolutely need, they're contributing to their household or they're putting money into their own college fund. But more often than not, what parents do is say you've got to get a job, and then the teens get the job and they spend all the money on clothes and you know videos and iPods and paying their cell phone bills because they don't need a cell phone anyway.\r\nALEX COHEN, host: So it's not going towards the college tuition at all.\r\nMICHELLE SINGLETARY: It is not. It's just disposable income that they're disposing of. And parents are not setting any limits and you know and then the kids get used to the fact that they're using all of their paycheck. That's another bad habit. Because they don't have to pay bills and all, all their income goes through you know this stuff.\r\nMICHELLE SINGLETARY: And when it comes time to get a real job, they're surprised they don't have enough money. And so you know what? You can wait to work. Instead, maybe they can spend the summer volunteering at a charitable organization or you know going back to school and boosting up their math skills or their English skills. We push the teens out into the market too soon, I think for some families.\r\nALEX COHEN, host: But now let's say your kid is working. What tips can parents provide in terms of holding on to that summer money?\r\nMICHELLE SINGLETARY: You know, before they get their job, they need to sit down with them and do a budget. 
So before they actually work and get that first paycheck I mean, you know, have them draw up a budge where the money is going. And you ought to have some requirements for some of their money. That's right, be a parent.\r\nMICHELLE SINGLETARY: So make them put some of it towards their college fund, if in fact they're headed for college. You know what? Make them put some away, I call it the tax fund, even though they may not have to pay taxes, but to pay for long-term things that they may want. You know, books once they get to college, or maybe they want to get a car, and they can actually pay cash for it, with some of these funds. Don't let them just go out and spend it on movies and stuff. You ought to set some guidelines - this is where you should put the money. And look at their budget.\r\nALEX COHEN, host: Day to Day's personal finance contributor Michelle Singletary. Thank you, Michelle!\r\nMICHELLE SINGLETARY: You're welcome.\r\nALEX COHEN, host: Stay with us. NPR's Day to Day continues.",
"summary": "The tight job market could be bad news for teens seeking summer work. If your teen does find a job, will he or she know how to manage those paychecks? Our personal finance contributor talks with Alex Cohen about ways to help teens find a job.",
"summary_de": "Der angespannte Arbeitsmarkt könnte für Jugendliche, die Sommerarbeit suchen, eine schlechte Nachricht sein. Wenn Ihr Teenager einen Job findet, wird er oder sie wissen, wie er mit diesen Gehaltsschecks umgeht? Unser Mitarbeiter für persönliche Finanzen spricht mit Alex Cohen darüber, wie Teenager bei der Jobsuche unterstützt werden können.",
"summary_zh": "紧张的就业市场对寻找暑期工作的青少年来说可能是个坏消息。如果你的孩子找到了一份工作,他/她懂得怎么管理这些薪水吗?我们的个人理财撰稿人与亚历克斯·科恩谈论如何帮助青少年找到工作。"
},
```
### Data Fields
- 'dialogue': An English dialogue
- 'summary': the original English summary of the corresponding dialogue (provided by MediaSum)
- 'summary_de': the human-translated German summary
- 'summary_zh': the human-translated Chinese summary
### Data Splits
- training set: 20K samples
- validation set: 10K samples
- testing set: 10K samples
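Given the fields described above, each record yields two cross-lingual training pairs (English dialogue -> German summary, English dialogue -> Chinese summary). The sketch below shows one way to unpack a record in the JSON format from the Data Instances section; the `to_xls_pairs` helper is illustrative, not part of the dataset:

```python
# One record in the format shown under "Data Instances" (texts abridged).
record = {
    "dialogue": "ALEX COHEN, host: ...",  # English dialogue (source side)
    "summary": "Our personal finance contributor talks with Alex Cohen ...",
    "summary_de": "Unser Mitarbeiter für persönliche Finanzen spricht ...",
    "summary_zh": "我们的个人理财撰稿人与亚历克斯·科恩谈论...",
}

def to_xls_pairs(record):
    """Turn one record into (target language, dialogue, summary) pairs
    for English->German and English->Chinese dialogue summarization."""
    return [
        ("de", record["dialogue"], record["summary_de"]),
        ("zh", record["dialogue"], record["summary_zh"]),
    ]

print(len(to_xls_pairs(record)))  # 2
```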
## Dataset Creation
Please refer to [our paper](https://aclanthology.org/2022.emnlp-main.526/) for more details.
## Considerations for Using the Data
Please refer to [our paper](https://aclanthology.org/2022.emnlp-main.526/) for more details.
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/krystalan/ClidSum)
### Licensing Information
License: CC BY-NC-SA 4.0
### Citation Information
```
@inproceedings{wang-etal-2022-clidsum,
title = "{C}lid{S}um: A Benchmark Dataset for Cross-Lingual Dialogue Summarization",
author = "Wang, Jiaan and
Meng, Fandong and
Lu, Ziyao and
Zheng, Duo and
Li, Zhixu and
Qu, Jianfeng and
Zhou, Jie",
booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
month = dec,
year = "2022",
address = "Abu Dhabi, United Arab Emirates",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2022.emnlp-main.526",
pages = "7716--7729",
abstract = "We present ClidSum, a benchmark dataset towards building cross-lingual summarization systems on dialogue documents. It consists of 67k+ dialogue documents and 112k+ annotated summaries in different target languages. Based on the proposed ClidSum, we introduce two benchmark settings for supervised and semi-supervised scenarios, respectively. We then build various baseline systems in different paradigms (pipeline and end-to-end) and conduct extensive experiments on ClidSum to provide deeper analyses. Furthermore, we propose mDialBART which extends mBART via further pre-training, where the multiple objectives help the pre-trained model capture the structural characteristics as well as key content in dialogues and the transformation from source to the target language. Experimental results show the superiority of mDialBART, as an end-to-end model, outperforms strong pipeline models on ClidSum. Finally, we discuss specific challenges that current approaches faced with this task and give multiple promising directions for future research. We have released the dataset and code at https://github.com/krystalan/ClidSum.",
}
```
### Contributions
Thanks to [@krystalan](https://github.com/krystalan) for adding this dataset. |
yzhuang/autotree_automl_10000_credit_sgosdt_l256_dim10_d3_sd0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: input_y_clean
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 236440000
num_examples: 10000
- name: validation
num_bytes: 236440000
num_examples: 10000
download_size: 126879367
dataset_size: 472880000
---
# Dataset Card for "autotree_automl_10000_credit_sgosdt_l256_dim10_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HebArabNlpProject/HebrewSentiment | ---
license: cc-by-4.0
configs:
- config_name: default
data_files:
- split: train
path: train/data.jsonl
- split: test
path: test/test.jsonl
task_categories:
- text-classification
language:
- he
size_categories:
- 10K<n<100K
---
# HebrewSentiment - A Sentiment-Analysis Dataset in Hebrew
## Summary
HebrewSentiment is a Hebrew dataset for the sentiment analysis task.
## Introduction
This dataset was constructed via [To Fill In].
## Dataset Statistics
The table below shows the number of examples from each category in each of the splits:
| split | total | positive | negative | neutral |
|-------|----------|----------|----------|---------|
| train | 39,135 | 8,968 | 7,669 | 22,498 |
| test | 2,170 | 503 | 433 | 1,234 |
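Because the label distribution is skewed toward the neutral class, a useful sanity check for any model is the majority-class baseline, which can be computed directly from the train counts in the table above:

```python
# Train-split label counts from the statistics table above.
counts = {"positive": 8968, "negative": 7669, "neutral": 22498}

total = sum(counts.values())            # 39,135 training examples
majority = max(counts.values()) / total # accuracy of always predicting "neutral"

print(f"majority-class baseline accuracy: {majority:.3f}")  # 0.575
```

Any trained classifier should comfortably beat this ~57.5% accuracy floor.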
## Dataset Description
Each row in the dataset contains the following fields:
- **id**: A unique identifier for the training example
- **text**: The textual content of the input sentence
- **tag_ids**: The label of the example (`Neutral`/`Positive`/`Negative`)
- **task_name**: [To fill in]
- **campaign_id**: [To fill in]
- **annotator_agreement_strength**: [To fill in]
- **survey_name**: [To fill in]
- **industry**: [To fill in]
- **type**: [To fill in]
## Models and Comparisons
In collaboration with [DICTA](https://dicta.org.il/) we trained a model on this dataset and are happy to release it to the public: [DictaBERT-Sentiment](https://huggingface.co/dicta-il/dictabert-sentiment).
In addition, we compared the performance of the model to the previous existing sentiment dataset - [Hebrew-Sentiment-Data from OnlpLab](https://github.com/OnlpLab/Hebrew-Sentiment-Data).
We fine-tuned [dictabert](https://huggingface.co/dicta-il/dictabert) three times - once on the OnlpLab dataset, once on this dataset, and once on both datasets together - and the results can be seen in the table below:
| Trained on \ evaluated on: | OnlpLab | | | | | HebrewSentiment| | | | |
|------------------|------|----------------|------|------|--------|--------------|------|------|---|---|
| | Accuracy | Macro F1 | F1 Positive | F1 Negative | F1 Neutral | Accuracy | Macro F1 | F1 Positive | F1 Negative | F1 Neutral |
| OnlpLab+HebrewSentiment | 87 | 61.7 | 93.2 | 74.6 | 17.4 | 83.9 | 82.7 | 79.8 | 81.8 | 86.4 |
| OnlpLab | 88.2 | 63.3 | 93.8 | 72.1 | 24 | 41.3 | 42.2 | 48.1 | 56.3 | 22.2 |
| HebrewSentiment | 69.9 | 51.7 | 82.2 | 62.9 | 10.2 | 84.4 | 83.2 | 81 | 82.1 | 86.6 |
## Contributors
[To fill in]
Contributors: [To fill in]
## Acknowledgments
We would like to express our gratitude to [To fill in] |
llm-aes/pandalm-annotated-full | ---
dataset_info:
features:
- name: task_id
dtype: string
- name: generator_1
dtype: string
- name: generator_2
dtype: string
- name: instruction
dtype: string
- name: input
dtype: string
- name: human_label
dtype: int64
- name: worker_id
dtype: string
- name: llm_label
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 2482896
num_examples: 7137
download_size: 111103
dataset_size: 2482896
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Einni/Milio | ---
license: openrail
---
|
zaanind/qasimplesi | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 17208
num_examples: 28
download_size: 9011
dataset_size: 17208
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "qasimplesi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pccl-org/formal-logic-simple-order-multi-token-dynamic-objects-paired-relationship-0-10000 | ---
dataset_info:
features:
- name: greater_than
sequence: int64
- name: less_than
sequence: int64
- name: paired_example
sequence:
sequence:
sequence: int64
- name: correct_example
sequence:
sequence: int64
- name: incorrect_example
sequence:
sequence: int64
- name: distance
dtype: int64
- name: index
dtype: int64
- name: index_in_distance
dtype: int64
splits:
- name: train
num_bytes: 1381150312
num_examples: 4865250
download_size: 470462561
dataset_size: 1381150312
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
cointegrated/ru-paraphrase-NMT-Leipzig | ---
annotations_creators:
- no-annotation
language_creators:
- machine-generated
language:
- ru
license:
- cc-by-4.0
multilinguality:
- translation
size_categories:
- 100K<n<1M
source_datasets:
- extended|other
task_categories:
- text-generation
pretty_name: ru-paraphrase-NMT-Leipzig
tags:
- conditional-text-generation
- paraphrase-generation
- paraphrase
---
# Dataset Card for **cointegrated/ru-paraphrase-NMT-Leipzig**
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Paper:** https://habr.com/ru/post/564916/
- **Point of Contact:** [@cointegrated](https://huggingface.co/cointegrated)
### Dataset Summary
The dataset contains 1 million Russian sentences and their automatically generated paraphrases.
It was created by David Dale ([@cointegrated](https://huggingface.co/cointegrated)) by translating the `rus-ru_web-public_2019_1M` corpus from [the Leipzig collection](https://wortschatz.uni-leipzig.de/en/download) into English and back into Russian. A fraction of the resulting paraphrases are invalid, and should be filtered out.
The blogpost ["Перефразирование русских текстов: корпуса, модели, метрики"](https://habr.com/ru/post/564916/) provides a detailed description of the dataset and its properties.
The dataset can be loaded with the following code:
```Python
import datasets
data = datasets.load_dataset(
'cointegrated/ru-paraphrase-NMT-Leipzig',
data_files={"train": "train.csv","val": "val.csv","test": "test.csv"},
)
```
Its output should look like
```
DatasetDict({
train: Dataset({
features: ['idx', 'original', 'en', 'ru', 'chrf_sim', 'labse_sim'],
num_rows: 980000
})
val: Dataset({
features: ['idx', 'original', 'en', 'ru', 'chrf_sim', 'labse_sim'],
num_rows: 10000
})
test: Dataset({
features: ['idx', 'original', 'en', 'ru', 'chrf_sim', 'labse_sim'],
num_rows: 10000
})
})
```
### Supported Tasks and Leaderboards
The dataset can be used to train and validate models for paraphrase generation or (if negative sampling is used) for paraphrase detection.
### Languages
Russian (main), English (auxiliary).
## Dataset Structure
### Data Instances
Data instances look like
```
{
"labse_sim": 0.93502015,
"chrf_sim": 0.4946451012684782,
"idx": 646422,
"ru": "О перспективах развития новых медиа-технологий в РФ расскажут на медиафоруме Енисея.",
"original": "Перспективы развития новых медиатехнологий в Российской Федерации обсудят участники медиафорума «Енисей.",
"en": "Prospects for the development of new media technologies in the Russian Federation will be discussed at the Yenisey Media Forum."
}
```
Where `original` is the original sentence, and `ru` is its machine-generated paraphrase.
### Data Fields
- `idx`: id of the instance in the original corpus
- `original`: the original sentence
- `en`: automatic translation of `original` to English
- `ru`: automatic translation of `en` back to Russian, i.e. a paraphrase of `original`
- `chrf_sim`: [ChrF++](https://huggingface.co/metrics/chrf) similarity of `original` and `ru`
- `labse_sim`: cosine similarity of [LaBSE](https://huggingface.co/cointegrated/LaBSE-en-ru) embeddings of `original` and `ru`
- `forward_entailment`: predicted probability that `original` entails `ru`
- `backward_entailment`: predicted probability that `ru` entails `original`
- `p_good`: predicted probability that `ru` and `original` have equivalent meaning
### Data Splits
Train – 980K, validation – 10K, test – 10K. The splits were generated randomly.
## Dataset Creation
### Curation Rationale
There are other Russian paraphrase corpora, but they have major drawbacks:
- The best known [corpus from paraphraser.ru 2016 contest](http://paraphraser.ru/download/) is rather small and covers only the News domain.
- [Opusparcus](https://huggingface.co/datasets/GEM/opusparcus), [ParaPhraserPlus](http://paraphraser.ru/download/), and [corpora of Tamara Zhordanija](https://github.com/tamriq/paraphrase) are noisy, i.e. a large proportion of sentence pairs in them have substantial difference in meaning.
- The Russian part of [TaPaCo](https://huggingface.co/datasets/tapaco) has very high lexical overlap in the sentence pairs; in other words, their paraphrases are not diverse enough.
The current corpus is generated with a dual objective: the paraphrases should be semantically as close as possible to the original sentences, while being lexically different from them. Back-translation with a restricted vocabulary seems to achieve this goal often enough.
### Source Data
#### Initial Data Collection and Normalization
The `rus-ru_web-public_2019_1M` corpus from [the Leipzig collection](https://wortschatz.uni-leipzig.de/en/download) as is.
The process of its creation is described [in this paper](http://www.lrec-conf.org/proceedings/lrec2012/pdf/327_Paper.pdf):
D. Goldhahn, T. Eckart & U. Quasthoff: Building Large Monolingual Dictionaries at the Leipzig Corpora Collection: From 100 to 200 Languages.
In: *Proceedings of the 8th International Language Resources and Evaluation (LREC'12), 2012*.
#### Automatic paraphrasing
The paraphrasing was carried out by translating the original sentence to English and then back to Russian.
The models [facebook/wmt19-ru-en](https://huggingface.co/facebook/wmt19-ru-en) and [facebook/wmt19-en-ru](https://huggingface.co/facebook/wmt19-en-ru) were used for translation.
To ensure that the back-translated texts are not identical to the original texts, the final decoder was prohibited from using the token n-grams of the original texts.
The code below implements the paraphrasing function.
```python
import torch
from transformers import FSMTTokenizer, FSMTForConditionalGeneration

# en->ru direction (used for the final back-translation step)
tokenizer = FSMTTokenizer.from_pretrained("facebook/wmt19-en-ru")
model = FSMTForConditionalGeneration.from_pretrained("facebook/wmt19-en-ru")
# ru->en direction (used for the initial translation step)
inverse_tokenizer = FSMTTokenizer.from_pretrained("facebook/wmt19-ru-en")
inverse_model = FSMTForConditionalGeneration.from_pretrained("facebook/wmt19-ru-en")
model.cuda()
inverse_model.cuda()
def paraphrase(text, gram=4, num_beams=5, **kwargs):
""" Generate a paraphrase using back translation.
Parameter `gram` denotes size of token n-grams of the original sentence that cannot appear in the paraphrase.
"""
input_ids = inverse_tokenizer.encode(text, return_tensors="pt")
with torch.no_grad():
outputs = inverse_model.generate(input_ids.to(inverse_model.device), num_beams=num_beams, **kwargs)
other_lang = inverse_tokenizer.decode(outputs[0], skip_special_tokens=True)
input_ids = input_ids[0, :-1].tolist()
bad_word_ids = [input_ids[i:(i+gram)] for i in range(len(input_ids)-gram)]
input_ids = tokenizer.encode(other_lang, return_tensors="pt")
with torch.no_grad():
outputs = model.generate(input_ids.to(model.device), num_beams=num_beams, bad_words_ids=bad_word_ids, **kwargs)
decoded = tokenizer.decode(outputs[0], skip_special_tokens=True)
return decoded
```
The corpus was created by running the above `paraphrase` function on the original sentences with parameters `gram=3, num_beams=5, repetition_penalty=3.14, no_repeat_ngram_size=6`.
### Annotations
#### Annotation process
The dataset was annotated by several automatic metrics:
- [ChrF++](https://huggingface.co/metrics/chrf) between `original` and `ru` sentences;
- cosine similarity between [LaBSE](https://huggingface.co/cointegrated/LaBSE-en-ru) embeddings of these sentences;
- forward and backward entailment probabilities predicted by the [rubert-base-cased-nli-twoway](https://huggingface.co/cointegrated/rubert-base-cased-nli-twoway) model;
- `p_good`, a metric aggregating the four metrics above into a single number. It is obtained with a logistic regression trained on 100 sentence pairs randomly chosen from the train set and manually labelled.
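As an illustration of how such a logistic aggregation works: the fitted coefficients are not published with this card, so the weights and bias below are made-up placeholders, not the ones used to produce `p_good`:

```python
import math

# Placeholder coefficients for illustration only; the real ones were fit
# on ~100 manually labelled pairs and are not published with the card.
WEIGHTS = {"chrf_sim": 1.0, "labse_sim": 4.0,
           "forward_entailment": 2.0, "backward_entailment": 2.0}
BIAS = -4.5

def p_good(metrics: dict) -> float:
    """Logistic regression over the four automatic metrics -> one score."""
    z = BIAS + sum(w * metrics[name] for name, w in WEIGHTS.items())
    return 1.0 / (1.0 + math.exp(-z))

good = {"chrf_sim": 0.5, "labse_sim": 0.95,
        "forward_entailment": 0.9, "backward_entailment": 0.9}
bad = {"chrf_sim": 0.2, "labse_sim": 0.4,
       "forward_entailment": 0.3, "backward_entailment": 0.2}
print(p_good(good) > p_good(bad))  # True
```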
#### Who are the annotators?
Human annotation was involved only for a small subset used to train the model for `p_good`. It was conducted by the dataset author, @cointegrated.
### Personal and Sensitive Information
The dataset is not known to contain any personal or sensitive information.
The sources and processes of original data collection are described at https://wortschatz.uni-leipzig.de/en/download.
## Considerations for Using the Data
### Social Impact of Dataset
The dataset may enable the creation of paraphrasing systems that can be used both for "good" purposes (such as assisting writers or augmenting text datasets) and for "bad" purposes (such as disguising plagiarism). The authors are not responsible for any uses of the dataset.
### Discussion of Biases
The dataset may inherit some of the biases of [the underlying Leipzig web corpus](https://wortschatz.uni-leipzig.de/en/download) or the neural machine translation models ([1](https://huggingface.co/facebook/wmt19-ru-en), [2](https://huggingface.co/facebook/wmt19-en-ru)) with which it was generated.
### Other Known Limitations
Most of the paraphrases in the dataset are valid (by a rough estimate, at least 80%). However, some sentence pairs have faults:
- Named entities are often spelled in different ways (e.g. `"Джейкоб" -> "Яков"`) or even replaced with other entities (e.g. `"Оймякон" -> "Оймянск"` or `"Верхоянск" -> "Тольятти"`).
- Sometimes the meaning of words or phrases changes significantly, e.g. `"полустанок" -> "полумашина"`, or `"были по колено в грязи" -> "лежали на коленях в иле"`.
- Sometimes the syntax is changed in a meaning-altering way, e.g. `"Интеллектуальное преимущество Вавилова и его соратников над демагогами из рядов сторонников новой агробиологии разительно очевидно." -> "Интеллектуал Вавилов и его приспешники в новой аграрной биологии явно превзошли демогогов."`.
- Grammatical properties that are present in Russian morphology but absent in English, such as gender, are often lost, e.g. `"Я не хотела тебя пугать" -> "Я не хотел пугать вас"`.
The field `labse_sim` reflects semantic similarity between the sentences, and it can be used to filter out at least some poor paraphrases.
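For example, a simple threshold filter over `labse_sim` could look like this; the 0.8 cutoff is an illustrative choice, not a recommendation from the dataset author:

```python
# Keep only sentence pairs whose LaBSE cosine similarity clears a
# threshold; the cutoff value here is illustrative.
def filter_by_labse_sim(pairs, min_sim=0.8):
    return [p for p in pairs if p["labse_sim"] >= min_sim]

pairs = [
    {"original": "...", "ru": "...", "labse_sim": 0.93},
    {"original": "...", "ru": "...", "labse_sim": 0.41},
]
kept = filter_by_labse_sim(pairs)
```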
## Additional Information
### Dataset Curators
The dataset was created by [David Dale](https://daviddale.ru/en), a.k.a. [@cointegrated](https://huggingface.co/cointegrated).
### Licensing Information
This corpus, as well as the original Leipzig corpora, are licensed under [CC BY](http://creativecommons.org/licenses/by/4.0/).
### Citation Information
[This blog post](https://habr.com/ru/post/564916/) can be cited:
```
@misc{dale_paraphrasing_2021,
author = "Dale, David",
title = "Перефразирование русских текстов: корпуса, модели, метрики",
editor = "habr.com",
url = "https://habr.com/ru/post/564916/",
month = {June},
year = {2021},
note = {[Online; posted 28-June-2021]},
}
```
### Contributions
Thanks to [@avidale](https://github.com/avidale) for adding this dataset. |
sam-mosaic/chat-v2 | ---
language: en
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 1053541716.4621352
num_examples: 306305
- name: test
num_bytes: 20265459.694286585
num_examples: 5339
download_size: 505718158
dataset_size: 1073807176.1564217
---
# Dataset Card for "chat_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
xtr8/john | ---
license: other
---
|
ccao/test2 | ---
license: mit
---
|
AdapterOcean/med_alpaca_standardized_cluster_8_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 25841419
num_examples: 14665
download_size: 13284008
dataset_size: 25841419
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_8_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Here2DoItWell/TestingSC2 | ---
license: mit
---
|
jakartaresearch/poem-tweets | ---
annotations_creators:
- no-annotation
language:
- id
language_creators:
- found
license:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: poem_tweets
size_categories:
- 10K<n<100K
source_datasets:
- original
tags:
- poem
- tweets
- twitter
- indonesian
task_categories:
- text-generation
task_ids:
- language-modeling
---
# Dataset Card for Poem Tweets
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
The data are collected from Twitter. The purpose of this dataset is to train text generation models for short texts and to make sure the generated texts are coherent and rhythmic.
### Supported Tasks and Leaderboards
- Text Generation
- Language Model
### Languages
Indonesian
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@andreaschandra](https://github.com/andreaschandra) for adding this dataset. |
Multimodal-Fatima/DTD_parition1_test_facebook_opt_350m_Attributes_Caption_ns_1880_random | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
- name: scores
sequence: float64
splits:
- name: fewshot_1_bs_16
num_bytes: 92259986.0
num_examples: 1880
- name: fewshot_3_bs_16
num_bytes: 93272910.0
num_examples: 1880
download_size: 91287158
dataset_size: 185532896.0
---
# Dataset Card for "DTD_parition1_test_facebook_opt_350m_Attributes_Caption_ns_1880_random"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Fraternitas/ElektraGoFAQs-aug-text-en | ---
task_categories:
- question-answering
language:
- en
size_categories:
- n<1K
---
This is the ElektraGo FAQs dataset after applying data augmentation.
This repo has 3 versions of the dataset:
**1. Raw Dataset with data Augmentation**
- ElektraGo_FAQs_Augmented.csv
- ElektraGo_FAQs-Augmented-en.json
**2. Dataset in Llama2 prompt format**
- ElektraGo_FAQs-Text-en.json
- ElektraGo_FAQs_Text.csv
**3. Dataset in Llama2 prompt format with system prompts**
- ElektraGo_FAQs_Text_SystemPrompts.csv
|
Gin1234/Eng-Rongmei | ---
license: apache-2.0
task_categories:
- translation
language:
- en
tags:
- code
- webdataset
pretty_name: English to Rongmei
size_categories:
- 1K<n<10K
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
000111000 |
CyberHarem/ruukoto_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ruukoto (Touhou)
This is the dataset of ruukoto (Touhou), containing 40 images and their tags.
The core tags of this character are `green_hair, maid_headdress, short_hair, bow, blue_eyes, red_bow`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 40 | 27.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ruukoto_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 40 | 19.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ruukoto_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 64 | 32.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ruukoto_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 40 | 25.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ruukoto_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 64 | 43.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ruukoto_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ruukoto_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, maid, solo, apron, smile, blush, dress, open_mouth |
| 1 | 22 |  |  |  |  |  | blue_dress, 1girl, puffy_short_sleeves, solo, frills, maid_apron, looking_at_viewer, smile, holding, white_apron, bangs, open_mouth, simple_background, full_body, red_bowtie, waist_apron, broom, mary_janes, mop, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | maid | solo | apron | smile | blush | dress | open_mouth | blue_dress | puffy_short_sleeves | frills | maid_apron | looking_at_viewer | holding | white_apron | bangs | simple_background | full_body | red_bowtie | waist_apron | broom | mary_janes | mop | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-------|:--------|:--------|:--------|:--------|:-------------|:-------------|:----------------------|:---------|:-------------|:--------------------|:----------|:--------------|:--------|:--------------------|:------------|:-------------|:--------------|:--------|:-------------|:------|:-------------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 22 |  |  |  |  |  | X | | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
freddyaboulton/chatinterface_callback | ---
configs:
- config_name: default
data_files:
- split: train
path: "**/*.jsonl"
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
AmJr/deputies_questions_XIV_3000 | ---
license: apache-2.0
---
|
nayohan/022_summary_report_20percent | ---
dataset_info:
features:
- name: title
dtype: string
- name: text
dtype: string
- name: summary
dtype: string
- name: summary_extend
dtype: string
splits:
- name: train
num_bytes: 221543313
num_examples: 82581
download_size: 127611756
dataset_size: 221543313
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tilos/cantonese_processed_daily | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 3899650216
num_examples: 4060
download_size: 623179139
dataset_size: 3899650216
---
# Dataset Card for "cantonese_processed_daily"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
atmallen/quirky_math_bob_grader_last_1.0e_0.0p_finetuning | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: statement
dtype: string
- name: choices
sequence: string
- name: label
dtype:
class_label:
names:
'0': 'False'
'1': 'True'
- name: true_label
dtype: bool
splits:
- name: train
num_bytes: 11540623
num_examples: 200000
- name: validation
num_bytes: 1159427
num_examples: 20000
- name: test
num_bytes: 1159757
num_examples: 20000
download_size: 3315827
dataset_size: 13859807
---
# Dataset Card for "quirky_math_bob_grader_last_1.0e_0.0p_finetuning"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
5CD-AI/Vietnamese-TriviaQA-RC-gg-translated | ---
task_categories:
- question-answering
language:
- vi
- en
field: Data
--- |
Lollitor/CASF | ---
dataset_info:
features:
- name: '#code'
dtype: string
- name: inputs
dtype: string
splits:
- name: train
num_bytes: 310419
num_examples: 285
download_size: 110166
dataset_size: 310419
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "CASF"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AyoubChLin/FFHQ | ---
license: apache-2.0
---
|
autoevaluate/autoeval-staging-eval-project-squad_v2-c78baf7d-13885910 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- squad_v2
eval_info:
task: extractive_question_answering
model: nlpconnect/deberta-v3-xsmall-squad2
metrics: []
dataset_name: squad_v2
dataset_config: squad_v2
dataset_split: validation
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: nlpconnect/deberta-v3-xsmall-squad2
* Dataset: squad_v2
* Config: squad_v2
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@ankur310974](https://huggingface.co/ankur310974) for evaluating this model. |
Glac1er/thordinwpn | ---
license: unknown
---
|
harshithvh/alpaca_format1 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 412117
num_examples: 251
download_size: 88777
dataset_size: 412117
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
taeseokyi/github-issues | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: labels
list:
- name: id
dtype: int64
- name: node_id
dtype: string
- name: url
dtype: string
- name: name
dtype: string
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: assignees
list:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: milestone
dtype: 'null'
- name: comments
sequence: string
- name: created_at
dtype: timestamp[s]
- name: updated_at
dtype: timestamp[s]
- name: closed_at
dtype: timestamp[s]
- name: author_association
dtype: string
- name: active_lock_reason
dtype: 'null'
- name: draft
dtype: bool
- name: pull_request
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: diff_url
dtype: string
- name: patch_url
dtype: string
- name: merged_at
dtype: timestamp[s]
- name: body
dtype: string
- name: reactions
struct:
- name: url
dtype: string
- name: total_count
dtype: int64
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: laugh
dtype: int64
- name: hooray
dtype: int64
- name: confused
dtype: int64
- name: heart
dtype: int64
- name: rocket
dtype: int64
- name: eyes
dtype: int64
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: 'null'
- name: state_reason
dtype: string
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 386968
num_examples: 100
download_size: 169642
dataset_size: 386968
---
# Dataset Card for "github-issues"
annotations_creators:
- expert-generated
language:
- en
language_creators:
- found
license:
- cc-by-nc-sa-3.0
multilinguality:
- monolingual
paperswithcode_id: null
pretty_name: Hugging Face GitHub Issues
size_categories:
- 1K<n<10K
source_datasets:
- original
tags:
- bio
- paper
task_categories:
- text-classification
- table-to-text
task_ids:
- multi-class-classification
- sentiment-classification
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
saarus72/pikabu_text_norm | ---
license: apache-2.0
task_categories:
- text-generation
language:
- ru
size_categories:
- 1M<n<10M
---
Inverse-normalized texts obtained from the [pikabu](https://huggingface.co/datasets/IlyaGusev/pikabu) dataset.
Normalized using [these notebooks](https://github.com/saarus72/text_normalization) for a personal [Russian normalization model](https://huggingface.co/saarus72/russian_text_normalizer) (also available as an [HF Space](https://huggingface.co/spaces/saarus72/russian-text-normalization)).
Everything is put into a single `jsonl` file with lines like the following (beautified):
```json
{
"tn": "\\- Ну как так то? У нас в Норильске при минус сорока градусах в буран люди не замерзают, а у вас при минус десяти без ветра человек насмерть замёрз?",
"itn": "\\- Ну как так то? У нас в Норильске при минус 40 градусах в буран люди не замерзают, а у вас при минус 10 без ветра человек насмерть замёрз?",
"orig_index": 7178627,
"text_index": 1,
"replaces": [
{
"text_from": "\\- Ну как так то? У нас в Норильске при минус ",
"text_to": "\\- Ну как так то? У нас в Норильске при минус "
},
{
"text_from": "40",
"text_to": "сорока"
},
{
"text_from": " градусах в буран люди не замерзают, а у вас при минус ",
"text_to": " градусах в буран люди не замерзают, а у вас при минус "
},
{
"text_from": "10",
"text_to": "десяти"
},
{
"text_from": " без ветра человек насмерть замёрз?",
"text_to": " без ветра человек насмерть замёрз?"
}
]
}
```
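Based on the sample above, the `replaces` segments appear to concatenate back into the full texts: joining the `text_from` fields yields `itn`, and joining the `text_to` fields yields `tn`. A minimal check, using a shortened record derived from the sample:

```python
# Shortened record derived from the sample above.
record = {
    "tn": "при минус сорока градусах",
    "itn": "при минус 40 градусах",
    "replaces": [
        {"text_from": "при минус ", "text_to": "при минус "},
        {"text_from": "40", "text_to": "сорока"},
        {"text_from": " градусах", "text_to": " градусах"},
    ],
}

# Joining the aligned segments reconstructs both text variants.
itn = "".join(r["text_from"] for r in record["replaces"])
tn = "".join(r["text_to"] for r in record["replaces"])
```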
|
mrfakename/Pure-Dove-ShareGPT | ---
license: apache-2.0
---
It's https://huggingface.co/datasets/LDJnr/Pure-Dove, but in the ShareGPT format!
Load it easily in Axolotl by setting the type to ShareGPT.
Convert @LDJnr datasets to ShareGPT using this script:
```python
import json

# Read the original dataset: one JSON object per line.
with open('ds.jsonl') as f:
    lines = f.read().strip().splitlines()

cvs = []
for line in lines:
    convos = json.loads(line)['conversation']
    cv = []
    for convo in convos:
        # Each turn becomes a human/gpt pair in ShareGPT format.
        cv.append({'from': 'human', 'value': convo['input']})
        cv.append({'from': 'gpt', 'value': convo['output']})
    cvs.append({'conversations': cv})

# Write all conversations out as a single JSON array.
with open('outputs.json', 'w') as f:
    json.dump(cvs, f)
```
somen-1001/bitqit-test-dataset | ---
task_categories:
- text-generation
language:
- en
pretty_name: bitqit-dataset
size_categories:
- n<1K
--- |
UnknownBot/Tobys-Lively-Tunes | ---
license: gpl-3.0
---
|
one-sec-cv12/chunk_78 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 23971371696.875
num_examples: 249577
download_size: 22358615413
dataset_size: 23971371696.875
---
# Dataset Card for "chunk_78"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gustacaste/gustacaste-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966693
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "gustacaste-1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LucasThil/miniwob_snippets_refs_onehot | ---
dataset_info:
features:
- name: episodes
dtype: string
- name: refs
dtype: int64
- name: click
dtype: int64
- name: dblclick
dtype: int64
- name: keydown
dtype: int64
- name: keypress
dtype: int64
- name: keyup
dtype: int64
- name: mousedown
dtype: int64
- name: mouseup
dtype: int64
- name: scroll
dtype: int64
- name: '1'
dtype: int64
- name: '2'
dtype: int64
- name: '3'
dtype: int64
- name: '4'
dtype: int64
- name: '5'
dtype: int64
- name: '6'
dtype: int64
- name: '7'
dtype: int64
- name: '8'
dtype: int64
- name: '9'
dtype: int64
- name: '10'
dtype: int64
- name: '11'
dtype: int64
- name: '12'
dtype: int64
- name: '13'
dtype: int64
- name: '14'
dtype: int64
- name: '15'
dtype: int64
- name: '16'
dtype: int64
- name: '17'
dtype: int64
- name: '18'
dtype: int64
- name: '19'
dtype: int64
- name: '20'
dtype: int64
- name: '21'
dtype: int64
- name: '22'
dtype: int64
- name: '23'
dtype: int64
- name: '24'
dtype: int64
- name: '25'
dtype: int64
- name: '26'
dtype: int64
- name: '27'
dtype: int64
- name: '28'
dtype: int64
- name: '29'
dtype: int64
- name: '30'
dtype: int64
- name: '32'
dtype: int64
- name: '33'
dtype: int64
- name: '34'
dtype: int64
- name: '35'
dtype: int64
- name: '36'
dtype: int64
- name: '37'
dtype: int64
- name: '38'
dtype: int64
- name: '39'
dtype: int64
- name: '40'
dtype: int64
- name: '41'
dtype: int64
- name: '42'
dtype: int64
- name: '43'
dtype: int64
- name: '44'
dtype: int64
- name: '45'
dtype: int64
- name: '46'
dtype: int64
- name: '47'
dtype: int64
- name: '48'
dtype: int64
- name: '49'
dtype: int64
- name: '50'
dtype: int64
- name: '51'
dtype: int64
- name: '52'
dtype: int64
- name: '53'
dtype: int64
- name: '54'
dtype: int64
- name: '55'
dtype: int64
- name: '56'
dtype: int64
- name: '57'
dtype: int64
- name: '58'
dtype: int64
- name: '59'
dtype: int64
- name: '60'
dtype: int64
- name: '61'
dtype: int64
- name: '62'
dtype: int64
- name: '63'
dtype: int64
- name: '64'
dtype: int64
- name: '65'
dtype: int64
- name: '66'
dtype: int64
- name: '67'
dtype: int64
- name: '68'
dtype: int64
- name: '69'
dtype: int64
- name: '70'
dtype: int64
- name: '71'
dtype: int64
- name: '72'
dtype: int64
- name: '73'
dtype: int64
- name: '74'
dtype: int64
- name: '75'
dtype: int64
- name: '76'
dtype: int64
- name: '77'
dtype: int64
- name: '78'
dtype: int64
- name: '79'
dtype: int64
- name: '80'
dtype: int64
- name: '81'
dtype: int64
- name: '82'
dtype: int64
- name: '83'
dtype: int64
- name: '84'
dtype: int64
- name: '85'
dtype: int64
- name: '86'
dtype: int64
- name: '87'
dtype: int64
- name: '88'
dtype: int64
- name: '89'
dtype: int64
- name: '90'
dtype: int64
- name: '91'
dtype: int64
- name: '92'
dtype: int64
- name: '93'
dtype: int64
- name: '94'
dtype: int64
- name: '95'
dtype: int64
- name: '96'
dtype: int64
- name: '97'
dtype: int64
- name: '98'
dtype: int64
- name: '99'
dtype: int64
- name: '100'
dtype: int64
- name: '101'
dtype: int64
- name: '102'
dtype: int64
- name: '103'
dtype: int64
- name: '104'
dtype: int64
- name: '105'
dtype: int64
- name: '106'
dtype: int64
- name: '107'
dtype: int64
- name: '108'
dtype: int64
- name: '109'
dtype: int64
- name: '110'
dtype: int64
- name: '111'
dtype: int64
- name: '112'
dtype: int64
- name: '113'
dtype: int64
- name: '114'
dtype: int64
- name: '115'
dtype: int64
- name: '116'
dtype: int64
- name: '117'
dtype: int64
- name: '118'
dtype: int64
- name: '119'
dtype: int64
- name: '120'
dtype: int64
- name: '121'
dtype: int64
- name: '122'
dtype: int64
- name: '123'
dtype: int64
- name: '124'
dtype: int64
- name: '125'
dtype: int64
- name: '126'
dtype: int64
- name: '127'
dtype: int64
- name: '129'
dtype: int64
- name: '130'
dtype: int64
- name: '131'
dtype: int64
- name: '132'
dtype: int64
- name: '133'
dtype: int64
- name: '134'
dtype: int64
- name: '135'
dtype: int64
- name: '136'
dtype: int64
- name: '137'
dtype: int64
- name: '138'
dtype: int64
- name: '139'
dtype: int64
- name: '140'
dtype: int64
- name: '142'
dtype: int64
- name: '143'
dtype: int64
- name: '144'
dtype: int64
- name: '145'
dtype: int64
- name: '146'
dtype: int64
- name: '147'
dtype: int64
- name: '148'
dtype: int64
- name: '149'
dtype: int64
- name: '150'
dtype: int64
- name: '151'
dtype: int64
- name: '152'
dtype: int64
- name: '153'
dtype: int64
- name: '154'
dtype: int64
- name: '155'
dtype: int64
- name: '156'
dtype: int64
- name: '157'
dtype: int64
- name: '160'
dtype: int64
- name: '166'
dtype: int64
- name: '180'
dtype: int64
- name: '181'
dtype: int64
splits:
- name: train
num_bytes: 1188943765
num_examples: 464060
- name: test
num_bytes: 148857112
num_examples: 58068
- name: validate
num_bytes: 148842043
num_examples: 57976
download_size: 150978553
dataset_size: 1486642920
---
# Dataset Card for "miniwob_snippets_refs_onehot"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yihouxiang/UHD4K | ---
license: mit
---
|
SummerSigh/PolicyData | ---
license: apache-2.0
---
|
jan-hq/finance_benchmark_mcq | ---
dataset_info:
features:
- name: id
dtype: string
- name: query
dtype: string
- name: answer
dtype: string
- name: text
dtype: string
- name: response
dtype: string
- name: options
struct:
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: golden_key
dtype: string
splits:
- name: train
num_bytes: 22833107.145736683
num_examples: 5074
download_size: 10322869
dataset_size: 22833107.145736683
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
djhugg/06myvozmasc06 | ---
license: openrail
---
|
abacusai/MetaMath_DPO_FewShot | ---
license: apache-2.0
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 1211199708
num_examples: 393999
- name: eval
num_bytes: 3029624
num_examples: 1000
download_size: 561354683
dataset_size: 1214229332
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: eval
path: data/eval-*
---

# Dataset Card for "MetaMath_DPO_FewShot"
GSM8K (Cobbe et al., 2021) is a dataset of diverse grade-school maths word problems, commonly adopted as a measure of the mathematical reasoning skills of LLMs.
The [MetaMath](https://meta-math.github.io/) dataset is an extension of the training set of GSM8K using data augmentation.
It is partitioned into queries and responses, where the query is a question involving mathematical calculation or reasoning, and the response is a logical series of steps and calculations that culminate in a final answer.
To construct our paired-preference version of MetaMath, we take the queries as prompts x and the responses as the preferred completions y_w. We create y_l by randomly corrupting the result of one intermediate calculation in the response so that it is incorrect, while leaving the final answer untouched.
An example is as follows:
x: "What is the total cost of purchasing equipment for all sixteen players on the football team, considering that each player requires a $25 jersey, a $15.20 pair of shorts, and a pair of socks priced at $6.80?"
y_w: "Each player requires a $25 jersey, a $15.20 pair of shorts, and a pair of socks priced at $6.80. So the total cost for each player is $25 + $15.20 + $6.80 = 47. Since there are sixteen players on the football team, the total cost for all of them is 16 * $47 = $752. \#\#\#\# 752 The answer is: 752"
y_l: "Each player requires a $25 jersey, a $15.20 pair of shorts, and a pair of socks priced at $6.80. So the total cost for each player is $25 + $15.20 + $6.80 = 52. Since there are sixteen players on the football team, the total cost for all of them is 16 * $47 = $752. \#\#\#\# 752 The answer is: 752"
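The corruption step above can be sketched as follows. This is an illustrative reconstruction, not the authors' exact procedure: the regex for locating intermediate results and the seeding scheme are assumptions.

```python
import random
import re

def corrupt_intermediate(response: str, seed: int = 0) -> str:
    """Corrupt one intermediate calculation result in a GSM8K-style
    response while leaving the final answer (after '####') untouched.
    Illustrative sketch only."""
    body, sep, answer = response.partition("####")
    # Intermediate results appear as '= 47' or '= 752' in the working.
    matches = list(re.finditer(r"= \$?(\d+(?:\.\d+)?)", body))
    if not matches:
        return response
    m = random.Random(seed).choice(matches)
    value = float(m.group(1))
    # Shift the result by a small nonzero amount so it becomes wrong.
    wrong = value + random.Random(seed + 1).randint(1, 9)
    wrong_str = str(int(wrong)) if wrong == int(wrong) else str(wrong)
    corrupted = body[: m.start(1)] + wrong_str + body[m.end(1):]
    return corrupted + sep + answer

y_w = ("Each player costs $25 + $15.20 + $6.80 = 47. "
       "For sixteen players the total is 16 * $47 = 752. #### 752")
y_l = corrupt_intermediate(y_w)
```

Because only the span of one intermediate result is rewritten, y_l stays almost identical to y_w, which is what keeps the pair's edit distance low.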
Our motivation in building this dataset is to align models towards being precise in intermediate calculations.
This dataset has a low edit distance between the preferred and rejected responses: the normalised edit distance is approximately 6.5%.
The dataset is meant to be used to fine-tune LLMs (which have already undergone SFT) using the DPOP loss function. We used this dataset to create the [Smaug series of models](https://github.com/abacusai/smaug).
The dataset contains 393,999 training examples and 1,000 evaluation examples.
See more details in the [datasheet](https://github.com/abacusai/smaug/blob/main/datasheet.md), and in our paper: https://arxiv.org/abs/2402.13228. |
open-llm-leaderboard/details_davzoku__frankencria-llama2-12.5b-v1.3-m.2 | ---
pretty_name: Evaluation run of davzoku/frankencria-llama2-12.5b-v1.3-m.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [davzoku/frankencria-llama2-12.5b-v1.3-m.2](https://huggingface.co/davzoku/frankencria-llama2-12.5b-v1.3-m.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_davzoku__frankencria-llama2-12.5b-v1.3-m.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-14T15:39:16.700250](https://huggingface.co/datasets/open-llm-leaderboard/details_davzoku__frankencria-llama2-12.5b-v1.3-m.2/blob/main/results_2024-02-14T15-39-16.700250.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4617722987057413,\n\
\ \"acc_stderr\": 0.03442751213597903,\n \"acc_norm\": 0.4686971200766007,\n\
\ \"acc_norm_stderr\": 0.03527698452163269,\n \"mc1\": 0.3219094247246022,\n\
\ \"mc1_stderr\": 0.016355567611960404,\n \"mc2\": 0.5030678363563933,\n\
\ \"mc2_stderr\": 0.01589753382807047\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5110921501706485,\n \"acc_stderr\": 0.01460779491401305,\n\
\ \"acc_norm\": 0.5503412969283277,\n \"acc_norm_stderr\": 0.014537144444284743\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6093407687711612,\n\
\ \"acc_stderr\": 0.0048690101522807505,\n \"acc_norm\": 0.7916749651463851,\n\
\ \"acc_norm_stderr\": 0.004052804959005537\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3925925925925926,\n\
\ \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.3925925925925926,\n\
\ \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.48026315789473684,\n \"acc_stderr\": 0.04065771002562605,\n\
\ \"acc_norm\": 0.48026315789473684,\n \"acc_norm_stderr\": 0.04065771002562605\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4867924528301887,\n \"acc_stderr\": 0.030762134874500476,\n\
\ \"acc_norm\": 0.4867924528301887,\n \"acc_norm_stderr\": 0.030762134874500476\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04181210050035455,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04181210050035455\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384739,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384739\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.36416184971098264,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.36416184971098264,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.39574468085106385,\n \"acc_stderr\": 0.031967586978353627,\n\
\ \"acc_norm\": 0.39574468085106385,\n \"acc_norm_stderr\": 0.031967586978353627\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.328042328042328,\n \"acc_stderr\": 0.024180497164376896,\n \"\
acc_norm\": 0.328042328042328,\n \"acc_norm_stderr\": 0.024180497164376896\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.03893259610604674,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.03893259610604674\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.535483870967742,\n \"acc_stderr\": 0.028372287797962935,\n \"\
acc_norm\": 0.535483870967742,\n \"acc_norm_stderr\": 0.028372287797962935\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3054187192118227,\n \"acc_stderr\": 0.03240661565868407,\n \"\
acc_norm\": 0.3054187192118227,\n \"acc_norm_stderr\": 0.03240661565868407\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5515151515151515,\n \"acc_stderr\": 0.038835659779569286,\n\
\ \"acc_norm\": 0.5515151515151515,\n \"acc_norm_stderr\": 0.038835659779569286\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5757575757575758,\n \"acc_stderr\": 0.035212249088415845,\n \"\
acc_norm\": 0.5757575757575758,\n \"acc_norm_stderr\": 0.035212249088415845\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6476683937823834,\n \"acc_stderr\": 0.03447478286414357,\n\
\ \"acc_norm\": 0.6476683937823834,\n \"acc_norm_stderr\": 0.03447478286414357\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.41794871794871796,\n \"acc_stderr\": 0.02500732988246122,\n\
\ \"acc_norm\": 0.41794871794871796,\n \"acc_norm_stderr\": 0.02500732988246122\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712166,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712166\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.0322529423239964,\n \
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.0322529423239964\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6458715596330276,\n \"acc_stderr\": 0.020504729013829118,\n \"\
acc_norm\": 0.6458715596330276,\n \"acc_norm_stderr\": 0.020504729013829118\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3148148148148148,\n \"acc_stderr\": 0.03167468706828979,\n \"\
acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.03167468706828979\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5735294117647058,\n \"acc_stderr\": 0.03471157907953426,\n \"\
acc_norm\": 0.5735294117647058,\n \"acc_norm_stderr\": 0.03471157907953426\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6286919831223629,\n \"acc_stderr\": 0.031450686007448596,\n \
\ \"acc_norm\": 0.6286919831223629,\n \"acc_norm_stderr\": 0.031450686007448596\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5515695067264574,\n\
\ \"acc_stderr\": 0.03337883736255098,\n \"acc_norm\": 0.5515695067264574,\n\
\ \"acc_norm_stderr\": 0.03337883736255098\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5267175572519084,\n \"acc_stderr\": 0.04379024936553894,\n\
\ \"acc_norm\": 0.5267175572519084,\n \"acc_norm_stderr\": 0.04379024936553894\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6033057851239669,\n \"acc_stderr\": 0.044658697805310094,\n \"\
acc_norm\": 0.6033057851239669,\n \"acc_norm_stderr\": 0.044658697805310094\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n\
\ \"acc_stderr\": 0.04820403072760627,\n \"acc_norm\": 0.5370370370370371,\n\
\ \"acc_norm_stderr\": 0.04820403072760627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5030674846625767,\n \"acc_stderr\": 0.03928297078179663,\n\
\ \"acc_norm\": 0.5030674846625767,\n \"acc_norm_stderr\": 0.03928297078179663\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6019417475728155,\n \"acc_stderr\": 0.04846748253977239,\n\
\ \"acc_norm\": 0.6019417475728155,\n \"acc_norm_stderr\": 0.04846748253977239\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6495726495726496,\n\
\ \"acc_stderr\": 0.0312561082442188,\n \"acc_norm\": 0.6495726495726496,\n\
\ \"acc_norm_stderr\": 0.0312561082442188\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6513409961685823,\n\
\ \"acc_stderr\": 0.017041243143490974,\n \"acc_norm\": 0.6513409961685823,\n\
\ \"acc_norm_stderr\": 0.017041243143490974\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5173410404624278,\n \"acc_stderr\": 0.026902900458666647,\n\
\ \"acc_norm\": 0.5173410404624278,\n \"acc_norm_stderr\": 0.026902900458666647\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.21675977653631284,\n\
\ \"acc_stderr\": 0.01378059848644335,\n \"acc_norm\": 0.21675977653631284,\n\
\ \"acc_norm_stderr\": 0.01378059848644335\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.477124183006536,\n \"acc_stderr\": 0.028599936776089782,\n\
\ \"acc_norm\": 0.477124183006536,\n \"acc_norm_stderr\": 0.028599936776089782\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5562700964630225,\n\
\ \"acc_stderr\": 0.02821768355665232,\n \"acc_norm\": 0.5562700964630225,\n\
\ \"acc_norm_stderr\": 0.02821768355665232\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5339506172839507,\n \"acc_stderr\": 0.027756535257347666,\n\
\ \"acc_norm\": 0.5339506172839507,\n \"acc_norm_stderr\": 0.027756535257347666\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.35815602836879434,\n \"acc_stderr\": 0.02860208586275942,\n \
\ \"acc_norm\": 0.35815602836879434,\n \"acc_norm_stderr\": 0.02860208586275942\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.35528031290743156,\n\
\ \"acc_stderr\": 0.012223623364044037,\n \"acc_norm\": 0.35528031290743156,\n\
\ \"acc_norm_stderr\": 0.012223623364044037\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.030187532060329387,\n\
\ \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.030187532060329387\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.45751633986928103,\n \"acc_stderr\": 0.020154685712590888,\n \
\ \"acc_norm\": 0.45751633986928103,\n \"acc_norm_stderr\": 0.020154685712590888\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n\
\ \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n\
\ \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4448979591836735,\n \"acc_stderr\": 0.031814251181977865,\n\
\ \"acc_norm\": 0.4448979591836735,\n \"acc_norm_stderr\": 0.031814251181977865\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5870646766169154,\n\
\ \"acc_stderr\": 0.034815208033673474,\n \"acc_norm\": 0.5870646766169154,\n\
\ \"acc_norm_stderr\": 0.034815208033673474\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7076023391812866,\n \"acc_stderr\": 0.03488647713457922,\n\
\ \"acc_norm\": 0.7076023391812866,\n \"acc_norm_stderr\": 0.03488647713457922\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3219094247246022,\n\
\ \"mc1_stderr\": 0.016355567611960404,\n \"mc2\": 0.5030678363563933,\n\
\ \"mc2_stderr\": 0.01589753382807047\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7024467245461721,\n \"acc_stderr\": 0.012849085254614662\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.03411675511751327,\n \
\ \"acc_stderr\": 0.00500021260077329\n }\n}\n```"
repo_url: https://huggingface.co/davzoku/frankencria-llama2-12.5b-v1.3-m.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|arc:challenge|25_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|gsm8k|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hellaswag|10_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T15-39-16.700250.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-14T15-39-16.700250.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- '**/details_harness|winogrande|5_2024-02-14T15-39-16.700250.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-14T15-39-16.700250.parquet'
- config_name: results
data_files:
- split: 2024_02_14T15_39_16.700250
path:
- results_2024-02-14T15-39-16.700250.parquet
- split: latest
path:
- results_2024-02-14T15-39-16.700250.parquet
---
# Dataset Card for Evaluation run of davzoku/frankencria-llama2-12.5b-v1.3-m.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [davzoku/frankencria-llama2-12.5b-v1.3-m.2](https://huggingface.co/davzoku/frankencria-llama2-12.5b-v1.3-m.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_davzoku__frankencria-llama2-12.5b-v1.3-m.2",
	"harness_winogrande_5",  # config name of the task of interest
	split="train")  # "train" points to the latest run
```
## Latest results
These are the [latest results from run 2024-02-14T15:39:16.700250](https://huggingface.co/datasets/open-llm-leaderboard/details_davzoku__frankencria-llama2-12.5b-v1.3-m.2/blob/main/results_2024-02-14T15-39-16.700250.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; each task appears in its own configuration, with a "latest" split for each eval):
```json
{
"all": {
"acc": 0.4617722987057413,
"acc_stderr": 0.03442751213597903,
"acc_norm": 0.4686971200766007,
"acc_norm_stderr": 0.03527698452163269,
"mc1": 0.3219094247246022,
"mc1_stderr": 0.016355567611960404,
"mc2": 0.5030678363563933,
"mc2_stderr": 0.01589753382807047
},
"harness|arc:challenge|25": {
"acc": 0.5110921501706485,
"acc_stderr": 0.01460779491401305,
"acc_norm": 0.5503412969283277,
"acc_norm_stderr": 0.014537144444284743
},
"harness|hellaswag|10": {
"acc": 0.6093407687711612,
"acc_stderr": 0.0048690101522807505,
"acc_norm": 0.7916749651463851,
"acc_norm_stderr": 0.004052804959005537
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3925925925925926,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.3925925925925926,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.48026315789473684,
"acc_stderr": 0.04065771002562605,
"acc_norm": 0.48026315789473684,
"acc_norm_stderr": 0.04065771002562605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4867924528301887,
"acc_stderr": 0.030762134874500476,
"acc_norm": 0.4867924528301887,
"acc_norm_stderr": 0.030762134874500476
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5,
"acc_stderr": 0.04181210050035455,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04181210050035455
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.36416184971098264,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.36416184971098264,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39574468085106385,
"acc_stderr": 0.031967586978353627,
"acc_norm": 0.39574468085106385,
"acc_norm_stderr": 0.031967586978353627
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.328042328042328,
"acc_stderr": 0.024180497164376896,
"acc_norm": 0.328042328042328,
"acc_norm_stderr": 0.024180497164376896
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604674,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604674
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.535483870967742,
"acc_stderr": 0.028372287797962935,
"acc_norm": 0.535483870967742,
"acc_norm_stderr": 0.028372287797962935
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3054187192118227,
"acc_stderr": 0.03240661565868407,
"acc_norm": 0.3054187192118227,
"acc_norm_stderr": 0.03240661565868407
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5515151515151515,
"acc_stderr": 0.038835659779569286,
"acc_norm": 0.5515151515151515,
"acc_norm_stderr": 0.038835659779569286
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5757575757575758,
"acc_stderr": 0.035212249088415845,
"acc_norm": 0.5757575757575758,
"acc_norm_stderr": 0.035212249088415845
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6476683937823834,
"acc_stderr": 0.03447478286414357,
"acc_norm": 0.6476683937823834,
"acc_norm_stderr": 0.03447478286414357
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.41794871794871796,
"acc_stderr": 0.02500732988246122,
"acc_norm": 0.41794871794871796,
"acc_norm_stderr": 0.02500732988246122
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712166,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712166
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.0322529423239964,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.0322529423239964
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6458715596330276,
"acc_stderr": 0.020504729013829118,
"acc_norm": 0.6458715596330276,
"acc_norm_stderr": 0.020504729013829118
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.03167468706828979,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.03167468706828979
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5735294117647058,
"acc_stderr": 0.03471157907953426,
"acc_norm": 0.5735294117647058,
"acc_norm_stderr": 0.03471157907953426
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6286919831223629,
"acc_stderr": 0.031450686007448596,
"acc_norm": 0.6286919831223629,
"acc_norm_stderr": 0.031450686007448596
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5515695067264574,
"acc_stderr": 0.03337883736255098,
"acc_norm": 0.5515695067264574,
"acc_norm_stderr": 0.03337883736255098
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5267175572519084,
"acc_stderr": 0.04379024936553894,
"acc_norm": 0.5267175572519084,
"acc_norm_stderr": 0.04379024936553894
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6033057851239669,
"acc_stderr": 0.044658697805310094,
"acc_norm": 0.6033057851239669,
"acc_norm_stderr": 0.044658697805310094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.04820403072760627,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.04820403072760627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5030674846625767,
"acc_stderr": 0.03928297078179663,
"acc_norm": 0.5030674846625767,
"acc_norm_stderr": 0.03928297078179663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.6019417475728155,
"acc_stderr": 0.04846748253977239,
"acc_norm": 0.6019417475728155,
"acc_norm_stderr": 0.04846748253977239
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6495726495726496,
"acc_stderr": 0.0312561082442188,
"acc_norm": 0.6495726495726496,
"acc_norm_stderr": 0.0312561082442188
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6513409961685823,
"acc_stderr": 0.017041243143490974,
"acc_norm": 0.6513409961685823,
"acc_norm_stderr": 0.017041243143490974
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5173410404624278,
"acc_stderr": 0.026902900458666647,
"acc_norm": 0.5173410404624278,
"acc_norm_stderr": 0.026902900458666647
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.21675977653631284,
"acc_stderr": 0.01378059848644335,
"acc_norm": 0.21675977653631284,
"acc_norm_stderr": 0.01378059848644335
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.477124183006536,
"acc_stderr": 0.028599936776089782,
"acc_norm": 0.477124183006536,
"acc_norm_stderr": 0.028599936776089782
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5562700964630225,
"acc_stderr": 0.02821768355665232,
"acc_norm": 0.5562700964630225,
"acc_norm_stderr": 0.02821768355665232
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5339506172839507,
"acc_stderr": 0.027756535257347666,
"acc_norm": 0.5339506172839507,
"acc_norm_stderr": 0.027756535257347666
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35815602836879434,
"acc_stderr": 0.02860208586275942,
"acc_norm": 0.35815602836879434,
"acc_norm_stderr": 0.02860208586275942
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.35528031290743156,
"acc_stderr": 0.012223623364044037,
"acc_norm": 0.35528031290743156,
"acc_norm_stderr": 0.012223623364044037
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.44485294117647056,
"acc_stderr": 0.030187532060329387,
"acc_norm": 0.44485294117647056,
"acc_norm_stderr": 0.030187532060329387
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.45751633986928103,
"acc_stderr": 0.020154685712590888,
"acc_norm": 0.45751633986928103,
"acc_norm_stderr": 0.020154685712590888
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4448979591836735,
"acc_stderr": 0.031814251181977865,
"acc_norm": 0.4448979591836735,
"acc_norm_stderr": 0.031814251181977865
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5870646766169154,
"acc_stderr": 0.034815208033673474,
"acc_norm": 0.5870646766169154,
"acc_norm_stderr": 0.034815208033673474
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7076023391812866,
"acc_stderr": 0.03488647713457922,
"acc_norm": 0.7076023391812866,
"acc_norm_stderr": 0.03488647713457922
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3219094247246022,
"mc1_stderr": 0.016355567611960404,
"mc2": 0.5030678363563933,
"mc2_stderr": 0.01589753382807047
},
"harness|winogrande|5": {
"acc": 0.7024467245461721,
"acc_stderr": 0.012849085254614662
},
"harness|gsm8k|5": {
"acc": 0.03411675511751327,
"acc_stderr": 0.00500021260077329
}
}
```
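As an illustration, the per-task scores above can be aggregated with a few lines of Python. The dictionary below is a small hand-copied subset of the results shown here, not loaded from the Hub; the preference for `acc_norm` over `acc` mirrors how normalized accuracy is reported when available.

```python
# A small, hand-copied subset of the per-task results shown above.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.5503412969283277},
    "harness|hellaswag|10": {"acc_norm": 0.7916749651463851},
    "harness|winogrande|5": {"acc": 0.7024467245461721},
}

# Prefer normalized accuracy when present, fall back to raw accuracy.
scores = [m.get("acc_norm", m.get("acc")) for m in results.values()]
mean_score = sum(scores) / len(scores)
print(f"mean score over {len(scores)} tasks: {mean_score:.4f}")
```

The same pattern extends to the full `results_*.json` file once downloaded: iterate over its task keys (skipping the `"all"` aggregate) and average whichever metric you care about.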
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
GoodBaiBai88/M3D-Cap | ---
license: apache-2.0
tags:
- medical
- 3D medical image caption
- image-text pair
- medical report
size_categories:
- 100K<n<1M
---
## Dataset Description
Large-scale 3D medical multi-modal dataset - Image-Text Pair Dataset (M3D-Cap)
### Dataset Introduction
Medical institutions, such as hospitals, store vast amounts of multi-modal data,
including medical images and diagnostic reports.
However, disclosing these multi-modal datasets involving patient data faces challenges due to sensitivity and privacy concerns.
To circumvent these limitations, we collected medical images and reports from the publicly accessible professional medical website [Radiopaedia](https://radiopaedia.org/).
Specifically, each patient case in our dataset includes multiple images and corresponding reports, which experts from the Radiopaedia platform meticulously review.
Given the crucial role of 3D CT in medical image analysis, particularly in the diagnosis, localization, and measurement of systemic lesions,
we focus on 3D CT data. We successfully constructed the largest-scale 3D medical image-text paired dataset, M3D-Cap,
comprising 120K image-text pairs. Overall, the data is divided into two folders named ct_case and ct_quizze;
ct_quizze is used for medical exams and is of higher quality. Each case folder contains one or more image folders and one text file.
The image folders contain multiple 2D slices of 3D images, while the text files provide English reports describing the corresponding 3D images,
including types of abnormalities and lesions. M3D_Cap.json provides the split scheme.
### Supported Tasks
M3D-Cap supports various image-text multimodal tasks in 3D medical scenarios,
including image-text retrieval, report generation, and image generation.
## Dataset Format and Structure
### Data Format
<pre>
M3D_Cap/
ct_case/
000006/
Axial_non_contrast/
0.jpeg
1.jpeg
......
text.txt
......
ct_quizze/
000007/
Axial_non_contrast/
0.png
1.png
......
text.txt
......
......
</pre>
### Dataset Download
#### Clone with HTTP
```bash
git clone
```
#### Manual Download
Alternatively, download all files manually, for example with a batch download tool.
Note: due to the large size of the overall dataset, it is divided into 20 GB subfiles.
After downloading all of them, extract them together to obtain the complete data.
### Dataset Loading Method
#### 1. Preprocessing
Combine the 2D slices under each image folder into 3D volumes, name them after the image folder (retaining plane and phase information),
and save them as npy files. Filter the text reports to obtain high-quality descriptions.
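The stacking and normalization step above can be sketched as follows. This is a minimal illustration using synthetic arrays in place of decoded JPEG slices; the actual folder traversal, slice ordering, and file naming depend on the dataset layout and are omitted here.

```python
import numpy as np

def stack_slices(slices):
    """Stack a list of 2D slice arrays into a (C, D, H, W) volume,
    min-max normalized to the [0, 1] range."""
    vol = np.stack(slices, axis=0).astype(np.float32)          # (D, H, W)
    vol = (vol - vol.min()) / (vol.max() - vol.min() + 1e-8)   # normalize
    return vol[np.newaxis, ...]                                # (1, D, H, W)

# Synthetic slices standing in for decoded JPEGs of one image folder
slices = [np.full((4, 4), i, dtype=np.uint8) for i in range(3)]
vol = stack_slices(slices)
print(vol.shape)  # (1, 3, 4, 4)
# np.save("ct_case/000006/Axial_non_contrast.npy", vol)  # hypothetical path
```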
#### 2. Build Dataset
We provide sample code for building the dataset:
```python
import json
import os
import random

import numpy as np
import torch
from monai import transforms as mtf
from monai.data import set_track_meta
from torch.utils.data import Dataset


class CapDataset(Dataset):
def __init__(self, args, tokenizer, mode="train"):
self.args = args
self.data_root = args.data_root
self.tokenizer = tokenizer
self.mode = mode
self.image_tokens = "<im_patch>" * args.proj_out_num
with open(args.cap_data_path, 'r') as file:
self.json_file = json.load(file)
self.data_list = self.json_file[mode]
self.caption_prompts = [
"Can you provide a caption consists of findings for this medical image?",
"Describe the findings of the medical image you see.",
"Please caption this medical scan with findings.",
"What is the findings of this image?",
"Describe this medical scan with findings.",
"Please write a caption consists of findings for this image.",
"Can you summarize with findings the images presented?",
"Please caption this scan with findings.",
"Please provide a caption consists of findings for this medical image.",
"Can you provide a summary consists of findings of this radiograph?",
"What are the findings presented in this medical scan?",
"Please write a caption consists of findings for this scan.",
"Can you provide a description consists of findings of this medical scan?",
"Please caption this medical scan with findings.",
"Can you provide a caption consists of findings for this medical scan?"
]
train_transform = mtf.Compose(
[
mtf.RandRotate90(prob=0.5, spatial_axes=(1, 2)),
mtf.RandFlip(prob=0.10, spatial_axis=0),
mtf.RandFlip(prob=0.10, spatial_axis=1),
mtf.RandFlip(prob=0.10, spatial_axis=2),
mtf.RandScaleIntensity(factors=0.1, prob=0.5),
mtf.RandShiftIntensity(offsets=0.1, prob=0.5),
mtf.ToTensor(dtype=torch.float),
]
)
val_transform = mtf.Compose(
[
mtf.ToTensor(dtype=torch.float),
]
)
set_track_meta(False)
if mode == 'train':
self.transform = train_transform
elif mode == 'validation':
self.transform = val_transform
elif mode == 'test':
self.transform = val_transform
def __len__(self):
return len(self.data_list)
def __getitem__(self, idx):
max_attempts = 100
for _ in range(max_attempts):
try:
data = self.data_list[idx]
image_path = data["image"]
image_abs_path = os.path.join(self.data_root, image_path)
image = np.load(image_abs_path)  # normalized to [0, 1], shape (C, D, H, W)
image = self.transform(image)
text_path = data["text"]
text_abs_path = os.path.join(self.data_root, text_path)
with open(text_abs_path, 'r') as text_file:
raw_text = text_file.read()
answer = raw_text
prompt_question = random.choice(self.caption_prompts)
question = self.image_tokens + prompt_question
text_tensor = self.tokenizer(
question + ' ' + answer, max_length=self.args.max_length, truncation=True, padding="max_length", return_tensors="pt"
)
input_id = text_tensor["input_ids"][0]
attention_mask = text_tensor["attention_mask"][0]
valid_len = torch.sum(attention_mask)
if valid_len < len(input_id):
input_id[valid_len] = self.tokenizer.eos_token_id
question_tensor = self.tokenizer(
question, max_length=self.args.max_length, truncation=True, padding="max_length", return_tensors="pt"
)
question_len = torch.sum(question_tensor["attention_mask"][0])
label = input_id.clone()
label[label == self.tokenizer.pad_token_id] = -100
label[:question_len] = -100
ret = {
'image': image,
'input_id': input_id,
'label': label,
'attention_mask': attention_mask,
'question': question,
'answer': answer,
'question_type': "Caption",
}
return ret
except Exception as e:
print(f"Error in __getitem__ at index {idx}: {e}")
idx = random.randint(0, len(self.data_list) - 1)
```
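A key detail in `__getitem__` above is the label masking: tokens belonging to the question prompt and to padding are set to -100 so that the language-model loss is computed only on the answer (report) tokens. A minimal standalone illustration of that scheme, with toy token ids in place of real tokenizer output:

```python
# Toy ids standing in for tokenizer output; 0 plays the role of the pad token.
input_ids = [101, 7, 8, 9, 42, 43, 0, 0]
question_len = 4  # number of tokens covering image tokens + prompt
pad_id = 0

# Mask question tokens and padding with -100 (ignored by cross-entropy loss).
labels = [(-100 if (i < question_len or t == pad_id) else t)
          for i, t in enumerate(input_ids)]
print(labels)  # [-100, -100, -100, -100, 42, 43, -100, -100]
```

Only the two answer tokens (42 and 43) contribute to the loss, mirroring the `label[:question_len] = -100` and pad-masking lines in the dataset class.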
### Data Splitting
The entire dataset is split into
`train`, `validation`, `test100`, `test500`, `test1k`, and `test` via a JSON file (M3D_Cap.json).
Considering testing costs, we provide test subsets of different sizes, from 100 samples up to 2k; the full `test` split contains 2k samples.
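Loading the split file can be sketched as below. The exact keys and relative paths inside M3D_Cap.json are assumptions for illustration; a small stand-in file is written first so the snippet is self-contained.

```python
import json

# Hypothetical contents mirroring the described split scheme of M3D_Cap.json.
example = {
    "train": [{"image": "ct_case/000006/Axial_non_contrast.npy",
               "text":  "ct_case/000006/text.txt"}],
    "validation": [], "test100": [], "test500": [], "test1k": [], "test": [],
}
with open("M3D_Cap_example.json", "w") as f:
    json.dump(example, f)

# Load the split scheme and pick one split's sample list.
with open("M3D_Cap_example.json") as f:
    splits = json.load(f)
print(sorted(splits))        # split names
print(len(splits["train"]))  # 1
```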
## Dataset Copyright Information
All images and reports involved in this dataset are publicly available data.
For detailed copyright information, please refer to the corresponding links.
## Citation
If you use this dataset, please cite the following works:
```BibTeX
@misc{bai2024m3d,
title={M3D: Advancing 3D Medical Image Analysis with Multi-Modal Large Language Models},
author={Fan Bai and Yuxin Du and Tiejun Huang and Max Q. -H. Meng and Bo Zhao},
year={2024},
eprint={2404.00578},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
@misc{du2024segvol,
title={SegVol: Universal and Interactive Volumetric Medical Image Segmentation},
author={Yuxin Du and Fan Bai and Tiejun Huang and Bo Zhao},
year={2024},
eprint={2311.13385},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
``` |
Isaacks/tissue-masker-dataset-without-damage | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 32167415.0
num_examples: 80
- name: validation
num_bytes: 3412353.0
num_examples: 9
download_size: 34278312
dataset_size: 35579768.0
---
# Dataset Card for "tissue-masker-dataset-without-damage"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Mitsuki-Sakamoto/fil_self_160m_bo16_2_mix_50_kl_0.1_prm_70m_thr_0.3_seed_1_t_1.0_eval | ---
dataset_info:
config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: index
dtype: int64
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
- name: gen_proxy_reward
dtype: float64
- name: gen_gold_reward
dtype: float64
splits:
- name: epoch_0
num_bytes: 44026189
num_examples: 18928
- name: epoch_1
num_bytes: 44652150
num_examples: 18928
- name: epoch_2
num_bytes: 44741481
num_examples: 18928
- name: epoch_3
num_bytes: 44789787
num_examples: 18928
- name: epoch_4
num_bytes: 44807465
num_examples: 18928
- name: epoch_5
num_bytes: 44808454
num_examples: 18928
- name: epoch_6
num_bytes: 44805793
num_examples: 18928
- name: epoch_7
num_bytes: 44796252
num_examples: 18928
- name: epoch_8
num_bytes: 44790569
num_examples: 18928
- name: epoch_9
num_bytes: 44787495
num_examples: 18928
- name: epoch_10
num_bytes: 44788385
num_examples: 18928
- name: epoch_11
num_bytes: 44787217
num_examples: 18928
- name: epoch_12
num_bytes: 44786387
num_examples: 18928
- name: epoch_13
num_bytes: 44785337
num_examples: 18928
- name: epoch_14
num_bytes: 44783315
num_examples: 18928
- name: epoch_15
num_bytes: 44783727
num_examples: 18928
- name: epoch_16
num_bytes: 44784999
num_examples: 18928
- name: epoch_17
num_bytes: 44784147
num_examples: 18928
- name: epoch_18
num_bytes: 44784601
num_examples: 18928
- name: epoch_19
num_bytes: 44784220
num_examples: 18928
- name: epoch_20
num_bytes: 44783326
num_examples: 18928
- name: epoch_21
num_bytes: 44783519
num_examples: 18928
- name: epoch_22
num_bytes: 44783523
num_examples: 18928
- name: epoch_23
num_bytes: 44784491
num_examples: 18928
- name: epoch_24
num_bytes: 44783568
num_examples: 18928
- name: epoch_25
num_bytes: 44783393
num_examples: 18928
- name: epoch_26
num_bytes: 44784171
num_examples: 18928
- name: epoch_27
num_bytes: 44784576
num_examples: 18928
- name: epoch_28
num_bytes: 44785050
num_examples: 18928
- name: epoch_29
num_bytes: 44784872
num_examples: 18928
download_size: 686226958
dataset_size: 1342698459
configs:
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: epoch_0
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_1-*
- split: epoch_2
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_2-*
- split: epoch_3
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_9-*
- split: epoch_10
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_10-*
- split: epoch_11
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_11-*
- split: epoch_12
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_12-*
- split: epoch_13
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_13-*
- split: epoch_14
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_14-*
- split: epoch_15
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_15-*
- split: epoch_16
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_16-*
- split: epoch_17
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_17-*
- split: epoch_18
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_18-*
- split: epoch_19
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_19-*
- split: epoch_20
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_20-*
- split: epoch_21
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_21-*
- split: epoch_22
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_22-*
- split: epoch_23
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_23-*
- split: epoch_24
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_24-*
- split: epoch_25
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_25-*
- split: epoch_26
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_26-*
- split: epoch_27
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_27-*
- split: epoch_28
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_28-*
- split: epoch_29
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_29-*
---
# Dataset Card for "fil_self_160m_bo16_2_mix_50_kl_0.1_prm_70m_thr_0.3_seed_1_t_1.0_eval"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Wnzz-Latte/instruction_mining_demo | ---
license: afl-3.0
---
|
communityai/communityai_apt-instruct-code-micro-600k | ---
dataset_info:
features:
- name: source
dtype: string
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 2409747348.0
num_examples: 578549
download_size: 1069969506
dataset_size: 2409747348.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-alpaca-test | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-alpaca-test
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-alpaca-test](https://huggingface.co/CHIH-HUNG/llama-2-13b-alpaca-test)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-alpaca-test\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T20:49:48.067362](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-alpaca-test/blob/main/results_2023-08-29T20%3A49%3A48.067362.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5568910569830451,\n\
\ \"acc_stderr\": 0.03436225133323378,\n \"acc_norm\": 0.5610380147243772,\n\
\ \"acc_norm_stderr\": 0.034342335699213765,\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766372,\n \"mc2\": 0.3693612523342933,\n\
\ \"mc2_stderr\": 0.014364347604420232\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5571672354948806,\n \"acc_stderr\": 0.014515573873348899,\n\
\ \"acc_norm\": 0.6006825938566553,\n \"acc_norm_stderr\": 0.014312094557946704\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6117307309300936,\n\
\ \"acc_stderr\": 0.004863603638367449,\n \"acc_norm\": 0.8128858793069109,\n\
\ \"acc_norm_stderr\": 0.003892060546588329\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.040335656678483184,\n\
\ \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.040335656678483184\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n\
\ \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5625,\n\
\ \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.5625,\n \
\ \"acc_norm_stderr\": 0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5260115606936416,\n\
\ \"acc_stderr\": 0.03807301726504511,\n \"acc_norm\": 0.5260115606936416,\n\
\ \"acc_norm_stderr\": 0.03807301726504511\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4340425531914894,\n \"acc_stderr\": 0.032400380867927465,\n\
\ \"acc_norm\": 0.4340425531914894,\n \"acc_norm_stderr\": 0.032400380867927465\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3306878306878307,\n \"acc_stderr\": 0.024229965298425072,\n \"\
acc_norm\": 0.3306878306878307,\n \"acc_norm_stderr\": 0.024229965298425072\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.04240799327574924,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.04240799327574924\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6580645161290323,\n\
\ \"acc_stderr\": 0.026985289576552742,\n \"acc_norm\": 0.6580645161290323,\n\
\ \"acc_norm_stderr\": 0.026985289576552742\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03681050869161551,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03681050869161551\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.702020202020202,\n \"acc_stderr\": 0.03258630383836557,\n \"acc_norm\"\
: 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836557\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.02749350424454806,\n\
\ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.02749350424454806\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5153846153846153,\n \"acc_stderr\": 0.025339003010106522,\n\
\ \"acc_norm\": 0.5153846153846153,\n \"acc_norm_stderr\": 0.025339003010106522\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.032145368597886394,\n\
\ \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.032145368597886394\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7376146788990826,\n \"acc_stderr\": 0.018861885021534738,\n \"\
acc_norm\": 0.7376146788990826,\n \"acc_norm_stderr\": 0.018861885021534738\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4166666666666667,\n \"acc_stderr\": 0.03362277436608044,\n \"\
acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.03362277436608044\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7352941176470589,\n \"acc_stderr\": 0.030964517926923403,\n \"\
acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.030964517926923403\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.729957805907173,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.729957805907173,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.042438692422305246,\n\
\ \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.042438692422305246\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094632,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094632\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.036230899157241446,\n\
\ \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.036230899157241446\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8076923076923077,\n\
\ \"acc_stderr\": 0.025819233256483717,\n \"acc_norm\": 0.8076923076923077,\n\
\ \"acc_norm_stderr\": 0.025819233256483717\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7611749680715197,\n\
\ \"acc_stderr\": 0.015246803197398674,\n \"acc_norm\": 0.7611749680715197,\n\
\ \"acc_norm_stderr\": 0.015246803197398674\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6445086705202312,\n \"acc_stderr\": 0.025770292082977254,\n\
\ \"acc_norm\": 0.6445086705202312,\n \"acc_norm_stderr\": 0.025770292082977254\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n\
\ \"acc_stderr\": 0.014893391735249619,\n \"acc_norm\": 0.27262569832402234,\n\
\ \"acc_norm_stderr\": 0.014893391735249619\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6045751633986928,\n \"acc_stderr\": 0.02799672318063145,\n\
\ \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.02799672318063145\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6463022508038585,\n\
\ \"acc_stderr\": 0.027155208103200865,\n \"acc_norm\": 0.6463022508038585,\n\
\ \"acc_norm_stderr\": 0.027155208103200865\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6203703703703703,\n \"acc_stderr\": 0.02700252103451647,\n\
\ \"acc_norm\": 0.6203703703703703,\n \"acc_norm_stderr\": 0.02700252103451647\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4078014184397163,\n \"acc_stderr\": 0.029316011776343555,\n \
\ \"acc_norm\": 0.4078014184397163,\n \"acc_norm_stderr\": 0.029316011776343555\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4178617992177314,\n\
\ \"acc_stderr\": 0.012596744108998557,\n \"acc_norm\": 0.4178617992177314,\n\
\ \"acc_norm_stderr\": 0.012596744108998557\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4889705882352941,\n \"acc_stderr\": 0.03036544647727568,\n\
\ \"acc_norm\": 0.4889705882352941,\n \"acc_norm_stderr\": 0.03036544647727568\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.565359477124183,\n \"acc_stderr\": 0.02005426920072646,\n \
\ \"acc_norm\": 0.565359477124183,\n \"acc_norm_stderr\": 0.02005426920072646\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.04673752333670239,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.04673752333670239\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6122448979591837,\n \"acc_stderr\": 0.031192230726795656,\n\
\ \"acc_norm\": 0.6122448979591837,\n \"acc_norm_stderr\": 0.031192230726795656\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n\
\ \"acc_stderr\": 0.030567675938916714,\n \"acc_norm\": 0.7512437810945274,\n\
\ \"acc_norm_stderr\": 0.030567675938916714\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766372,\n \"mc2\": 0.3693612523342933,\n\
\ \"mc2_stderr\": 0.014364347604420232\n }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-alpaca-test
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|arc:challenge|25_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hellaswag|10_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:49:48.067362.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T20:49:48.067362.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T20:49:48.067362.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T20:49:48.067362.parquet'
- config_name: results
data_files:
- split: 2023_08_29T20_49_48.067362
path:
- results_2023-08-29T20:49:48.067362.parquet
- split: latest
path:
- results_2023-08-29T20:49:48.067362.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-alpaca-test
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-alpaca-test
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-alpaca-test](https://huggingface.co/CHIH-HUNG/llama-2-13b-alpaca-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-alpaca-test",
	"harness_truthfulqa_mc_0",
	split="latest")
```
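The `acc_stderr` fields in the results below are consistent with the standard error of a sample mean over a task's questions, using the sample-variance (n − 1) form. A minimal sanity check, assuming the `hendrycksTest-abstract_algebra` task has 100 questions (as in MMLU); this reproduces the reported value and is an illustration, not part of the evaluation harness:

```python
import math

def acc_stderr(acc: float, n: int) -> float:
    """Standard error of a mean over n binary (0/1) scores,
    using the unbiased sample-variance denominator (n - 1)."""
    return math.sqrt(acc * (1 - acc) / (n - 1))

# hendrycksTest-abstract_algebra: acc = 0.34 over (assumed) 100 questions.
# The reported acc_stderr is 0.047609522856952365.
print(acc_stderr(0.34, 100))
```

The same formula reproduces the other per-task standard errors (e.g. `college_chemistry` with acc = 0.4 gives ≈ 0.0492366), which is a quick way to recover the number of questions behind a reported score.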
## Latest results
These are the [latest results from run 2023-08-29T20:49:48.067362](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-alpaca-test/blob/main/results_2023-08-29T20%3A49%3A48.067362.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in its task-specific config, under the timestamped and "latest" splits):
```python
{
"all": {
"acc": 0.5568910569830451,
"acc_stderr": 0.03436225133323378,
"acc_norm": 0.5610380147243772,
"acc_norm_stderr": 0.034342335699213765,
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766372,
"mc2": 0.3693612523342933,
"mc2_stderr": 0.014364347604420232
},
"harness|arc:challenge|25": {
"acc": 0.5571672354948806,
"acc_stderr": 0.014515573873348899,
"acc_norm": 0.6006825938566553,
"acc_norm_stderr": 0.014312094557946704
},
"harness|hellaswag|10": {
"acc": 0.6117307309300936,
"acc_stderr": 0.004863603638367449,
"acc_norm": 0.8128858793069109,
"acc_norm_stderr": 0.003892060546588329
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5657894736842105,
"acc_stderr": 0.040335656678483184,
"acc_norm": 0.5657894736842105,
"acc_norm_stderr": 0.040335656678483184
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5625,
"acc_stderr": 0.04148415739394154,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.04148415739394154
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.03807301726504511,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.03807301726504511
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4340425531914894,
"acc_stderr": 0.032400380867927465,
"acc_norm": 0.4340425531914894,
"acc_norm_stderr": 0.032400380867927465
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336936,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3306878306878307,
"acc_stderr": 0.024229965298425072,
"acc_norm": 0.3306878306878307,
"acc_norm_stderr": 0.024229965298425072
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574924,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574924
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6580645161290323,
"acc_stderr": 0.026985289576552742,
"acc_norm": 0.6580645161290323,
"acc_norm_stderr": 0.026985289576552742
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03681050869161551,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03681050869161551
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.03258630383836557,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.03258630383836557
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.02749350424454806,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.02749350424454806
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5153846153846153,
"acc_stderr": 0.025339003010106522,
"acc_norm": 0.5153846153846153,
"acc_norm_stderr": 0.025339003010106522
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.032145368597886394,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.032145368597886394
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7376146788990826,
"acc_stderr": 0.018861885021534738,
"acc_norm": 0.7376146788990826,
"acc_norm_stderr": 0.018861885021534738
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.03362277436608044,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.03362277436608044
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.030964517926923403,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.030964517926923403
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.729957805907173,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.729957805907173,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.042438692422305246,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.042438692422305246
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094632,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094632
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.036230899157241446,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.036230899157241446
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.025819233256483717,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.025819233256483717
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7611749680715197,
"acc_stderr": 0.015246803197398674,
"acc_norm": 0.7611749680715197,
"acc_norm_stderr": 0.015246803197398674
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6445086705202312,
"acc_stderr": 0.025770292082977254,
"acc_norm": 0.6445086705202312,
"acc_norm_stderr": 0.025770292082977254
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249619,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249619
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6045751633986928,
"acc_stderr": 0.02799672318063145,
"acc_norm": 0.6045751633986928,
"acc_norm_stderr": 0.02799672318063145
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6463022508038585,
"acc_stderr": 0.027155208103200865,
"acc_norm": 0.6463022508038585,
"acc_norm_stderr": 0.027155208103200865
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.02700252103451647,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.02700252103451647
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4078014184397163,
"acc_stderr": 0.029316011776343555,
"acc_norm": 0.4078014184397163,
"acc_norm_stderr": 0.029316011776343555
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4178617992177314,
"acc_stderr": 0.012596744108998557,
"acc_norm": 0.4178617992177314,
"acc_norm_stderr": 0.012596744108998557
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4889705882352941,
"acc_stderr": 0.03036544647727568,
"acc_norm": 0.4889705882352941,
"acc_norm_stderr": 0.03036544647727568
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.565359477124183,
"acc_stderr": 0.02005426920072646,
"acc_norm": 0.565359477124183,
"acc_norm_stderr": 0.02005426920072646
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670239,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670239
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6122448979591837,
"acc_stderr": 0.031192230726795656,
"acc_norm": 0.6122448979591837,
"acc_norm_stderr": 0.031192230726795656
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7512437810945274,
"acc_stderr": 0.030567675938916714,
"acc_norm": 0.7512437810945274,
"acc_norm_stderr": 0.030567675938916714
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766372,
"mc2": 0.3693612523342933,
"mc2_stderr": 0.014364347604420232
}
}
```
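Since every per-task block in this JSON exposes the same `acc`/`acc_norm` fields, summary statistics can be recomputed directly from the parsed dict. A minimal sketch, operating on a small hand-written excerpt of the structure above rather than the full results file:

```python
import json

# A small excerpt mirroring the per-task structure of the results JSON above.
results = json.loads("""
{
  "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.34, "acc_norm": 0.34},
  "harness|hendrycksTest-anatomy|5": {"acc": 0.45925925925925926, "acc_norm": 0.45925925925925926},
  "harness|hendrycksTest-astronomy|5": {"acc": 0.5657894736842105, "acc_norm": 0.5657894736842105}
}
""")

# Macro-average accuracy over the MMLU (hendrycksTest) subtasks in the excerpt.
mmlu = {name: scores for name, scores in results.items() if "hendrycksTest" in name}
macro_acc = sum(scores["acc"] for scores in mmlu.values()) / len(mmlu)
print(round(macro_acc, 4))
```

The same pattern works on the full file fetched from the repo; only the keys and the number of subtasks change.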
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/kamishirasawa_keine_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kamishirasawa_keine/上白沢慧音/上白沢慧音/카미시라사와케이네 (Touhou)
This is the dataset of kamishirasawa_keine/上白沢慧音/上白沢慧音/카미시라사와케이네 (Touhou), containing 500 images and their tags.
The core tags of this character are `long_hair, hat, blue_hair, red_eyes, breasts, bangs, white_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 559.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kamishirasawa_keine_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 358.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kamishirasawa_keine_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1153 | 734.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kamishirasawa_keine_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 512.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kamishirasawa_keine_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1153 | 961.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kamishirasawa_keine_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kamishirasawa_keine_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
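Since the IMG+TXT packages pair each image with a plain-text tag file, a tag-based subset can also be selected without waifuc. A hypothetical sketch; the sidecar `.txt` naming and the comma-separated tag format are assumptions based on the package descriptions above:

```python
import os

def find_images_with_tag(dataset_dir: str, tag: str) -> list:
    """Return image base names whose sidecar .txt tag file contains `tag`."""
    matches = []
    for name in sorted(os.listdir(dataset_dir)):
        if not name.endswith('.txt'):
            continue
        with open(os.path.join(dataset_dir, name), encoding='utf-8') as f:
            # Assumed format: one comma-separated tag list per file.
            tags = {t.strip() for t in f.read().split(',')}
        if tag in tags:
            matches.append(os.path.splitext(name)[0])
    return matches
```

For example, `find_images_with_tag(dataset_dir, 'blue_dress')` would return the base names of all images tagged with that outfit.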
## List of Clusters
List of tag clustering results; some recurring outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, solo, dress, scroll, open_mouth |
| 1 | 17 |  |  |  |  |  | 1girl, solo, blue_dress, looking_at_viewer, smile, very_long_hair, puffy_short_sleeves, white_background |
| 2 | 8 |  |  |  |  |  | 1girl, blue_dress, blue_headwear, looking_at_viewer, puffy_short_sleeves, simple_background, smile, solo, red_neckerchief, white_background, upper_body, closed_mouth, hair_between_eyes, two-tone_hair |
| 3 | 5 |  |  |  |  |  | 1girl, blue_dress, blue_headwear, blush, hair_between_eyes, holding, looking_at_viewer, puffy_short_sleeves, solo, chalkboard, cowboy_shot, open_mouth, red_neckerchief, tokin_hat, collarbone, indoors, cleavage, medium_breasts, smile |
| 4 | 7 |  |  |  |  |  | 1girl, blue_dress, blue_headwear, footwear_bow, full_body, looking_at_viewer, red_bow, red_neckerchief, shoes, solo, black_footwear, puffy_short_sleeves, simple_background, closed_mouth, standing, two-tone_hair, white_background, white_socks |
| 5 | 10 |  |  |  |  |  | 1girl, green_dress, solo, green_hair, puffy_short_sleeves, closed_mouth, full_moon, horn_bow, horn_ribbon, looking_at_viewer, red_bow, red_neckerchief, smile, night |
| 6 | 17 |  |  |  |  |  | 1girl, blue_dress, blush, solo, chibi, kemonomimi_mode, dog_ears, dog_tail, blue_eyes, fang, :3, expressive_clothes, open_mouth, smile, minigirl |
| 7 | 7 |  |  |  |  |  | 1girl, blush, nipples, solo, large_breasts, sweat, collarbone, grey_background, looking_at_viewer, open_mouth, simple_background, armpits, arms_behind_head, arms_up, blue_headwear, completely_nude, navel, sitting, upper_body |
| 8 | 16 |  |  |  |  |  | 1girl, blush, nipples, hetero, solo_focus, 1boy, penis, nude, pussy, huge_breasts, cum, large_breasts, open_mouth, pubic_hair, sex, vaginal, bar_censor, sweat |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | dress | scroll | open_mouth | blue_dress | looking_at_viewer | smile | very_long_hair | puffy_short_sleeves | white_background | blue_headwear | simple_background | red_neckerchief | upper_body | closed_mouth | hair_between_eyes | two-tone_hair | blush | holding | chalkboard | cowboy_shot | tokin_hat | collarbone | indoors | cleavage | medium_breasts | footwear_bow | full_body | red_bow | shoes | black_footwear | standing | white_socks | green_dress | green_hair | full_moon | horn_bow | horn_ribbon | night | chibi | kemonomimi_mode | dog_ears | dog_tail | blue_eyes | fang | :3 | expressive_clothes | minigirl | nipples | large_breasts | sweat | grey_background | armpits | arms_behind_head | arms_up | completely_nude | navel | sitting | hetero | solo_focus | 1boy | penis | nude | pussy | huge_breasts | cum | pubic_hair | sex | vaginal | bar_censor |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:---------|:-------------|:-------------|:--------------------|:--------|:-----------------|:----------------------|:-------------------|:----------------|:--------------------|:------------------|:-------------|:---------------|:--------------------|:----------------|:--------|:----------|:-------------|:--------------|:------------|:-------------|:----------|:-----------|:-----------------|:---------------|:------------|:----------|:--------|:-----------------|:-----------|:--------------|:--------------|:-------------|:------------|:-----------|:--------------|:--------|:--------|:------------------|:-----------|:-----------|:------------|:-------|:-----|:---------------------|:-----------|:----------|:----------------|:--------|:------------------|:----------|:-------------------|:----------|:------------------|:--------|:----------|:---------|:-------------|:-------|:--------|:-------|:--------|:---------------|:------|:-------------|:------|:----------|:-------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 17 |  |  |  |  |  | X | X | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | X | | | | X | X | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | | | X | X | X | X | | X | | X | | X | | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | X | | | | X | X | | | X | X | X | X | X | | X | | X | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 10 |  |  |  |  |  | X | X | | | | | X | X | | X | | | | X | | X | | | | | | | | | | | | | | X | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 17 |  |  |  |  |  | X | X | | | X | X | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 7 |  |  |  |  |  | X | X | | | X | | X | | | | | X | X | | X | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 8 | 16 |  |  |  |  |  | X | | | | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
open-llm-leaderboard/details_wandb__mistral-7b-zephyr-dpo | ---
pretty_name: Evaluation run of wandb/mistral-7b-zephyr-dpo
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [wandb/mistral-7b-zephyr-dpo](https://huggingface.co/wandb/mistral-7b-zephyr-dpo)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wandb__mistral-7b-zephyr-dpo\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-11T21:42:03.928518](https://huggingface.co/datasets/open-llm-leaderboard/details_wandb__mistral-7b-zephyr-dpo/blob/main/results_2024-03-11T21-42-03.928518.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6197175143025987,\n\
\ \"acc_stderr\": 0.032785226600484156,\n \"acc_norm\": 0.6241561892365968,\n\
\ \"acc_norm_stderr\": 0.03344678060029092,\n \"mc1\": 0.4149326805385557,\n\
\ \"mc1_stderr\": 0.017248314465805978,\n \"mc2\": 0.5660736416141117,\n\
\ \"mc2_stderr\": 0.015703591472463297\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6126279863481229,\n \"acc_stderr\": 0.01423587248790987,\n\
\ \"acc_norm\": 0.6373720136518771,\n \"acc_norm_stderr\": 0.014049106564955012\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6693885680143398,\n\
\ \"acc_stderr\": 0.004694718918225753,\n \"acc_norm\": 0.8578968333001394,\n\
\ \"acc_norm_stderr\": 0.003484423442092664\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395269,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395269\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880263,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880263\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n\
\ \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n\
\ \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.032579014820998356,\n\
\ \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.032579014820998356\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7451612903225806,\n \"acc_stderr\": 0.024790118459332208,\n \"\
acc_norm\": 0.7451612903225806,\n \"acc_norm_stderr\": 0.024790118459332208\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4827586206896552,\n \"acc_stderr\": 0.035158955511657,\n \"acc_norm\"\
: 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511657\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\": 0.59,\n\
\ \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198892,\n \"\
acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198892\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758733,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5897435897435898,\n \"acc_stderr\": 0.024939313906940798,\n\
\ \"acc_norm\": 0.5897435897435898,\n \"acc_norm_stderr\": 0.024939313906940798\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6050420168067226,\n \"acc_stderr\": 0.03175367846096626,\n \
\ \"acc_norm\": 0.6050420168067226,\n \"acc_norm_stderr\": 0.03175367846096626\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658753,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658753\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8110091743119267,\n \"acc_stderr\": 0.01678548115920363,\n \"\
acc_norm\": 0.8110091743119267,\n \"acc_norm_stderr\": 0.01678548115920363\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145628,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145628\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094634,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094634\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.04742762361243011,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.04742762361243011\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7841634738186463,\n\
\ \"acc_stderr\": 0.014711684386139953,\n \"acc_norm\": 0.7841634738186463,\n\
\ \"acc_norm_stderr\": 0.014711684386139953\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3743016759776536,\n\
\ \"acc_stderr\": 0.01618544417945717,\n \"acc_norm\": 0.3743016759776536,\n\
\ \"acc_norm_stderr\": 0.01618544417945717\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.026336613469046633,\n\
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.026336613469046633\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.02517104191530968,\n\
\ \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.02517104191530968\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4485006518904824,\n\
\ \"acc_stderr\": 0.012702317490559806,\n \"acc_norm\": 0.4485006518904824,\n\
\ \"acc_norm_stderr\": 0.012702317490559806\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6433823529411765,\n \"acc_stderr\": 0.029097209568411952,\n\
\ \"acc_norm\": 0.6433823529411765,\n \"acc_norm_stderr\": 0.029097209568411952\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6683006535947712,\n \"acc_stderr\": 0.019047485239360378,\n \
\ \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.019047485239360378\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.028920583220675606,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.028920583220675606\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n\
\ \"acc_stderr\": 0.027962677604768917,\n \"acc_norm\": 0.8059701492537313,\n\
\ \"acc_norm_stderr\": 0.027962677604768917\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.031581495393387324,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.031581495393387324\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4149326805385557,\n\
\ \"mc1_stderr\": 0.017248314465805978,\n \"mc2\": 0.5660736416141117,\n\
\ \"mc2_stderr\": 0.015703591472463297\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7821625887924231,\n \"acc_stderr\": 0.011601066079939324\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4086429112964367,\n \
\ \"acc_stderr\": 0.013540639733342422\n }\n}\n```"
repo_url: https://huggingface.co/wandb/mistral-7b-zephyr-dpo
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|arc:challenge|25_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|arc:challenge|25_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|gsm8k|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|gsm8k|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hellaswag|10_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hellaswag|10_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|winogrande|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|winogrande|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-11T21-42-03.928518.parquet'
- config_name: results
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- results_2024-03-10T17-40-34.142017.parquet
- split: 2024_03_11T21_42_03.928518
path:
- results_2024-03-11T21-42-03.928518.parquet
- split: latest
path:
- results_2024-03-11T21-42-03.928518.parquet
---
# Dataset Card for Evaluation run of wandb/mistral-7b-zephyr-dpo
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [wandb/mistral-7b-zephyr-dpo](https://huggingface.co/wandb/mistral-7b-zephyr-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_wandb__mistral-7b-zephyr-dpo",
	"harness_winogrande_5",
	split="latest")
```
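As the configurations above show, split names are derived from the run timestamps, with `-` and `:` replaced by `_`. A small helper illustrating that naming convention (this function is not part of the dataset tooling, just a sketch of the mapping):

```python
# Split names are run timestamps with "-" and ":" replaced by "_",
# e.g. "2024-03-11T21:42:03.928518" -> "2024_03_11T21_42_03.928518".
def timestamp_to_split(ts: str) -> str:
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2024-03-11T21:42:03.928518"))
# -> 2024_03_11T21_42_03.928518
```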
## Latest results
These are the [latest results from run 2024-03-11T21:42:03.928518](https://huggingface.co/datasets/open-llm-leaderboard/details_wandb__mistral-7b-zephyr-dpo/blob/main/results_2024-03-11T21-42-03.928518.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6197175143025987,
"acc_stderr": 0.032785226600484156,
"acc_norm": 0.6241561892365968,
"acc_norm_stderr": 0.03344678060029092,
"mc1": 0.4149326805385557,
"mc1_stderr": 0.017248314465805978,
"mc2": 0.5660736416141117,
"mc2_stderr": 0.015703591472463297
},
"harness|arc:challenge|25": {
"acc": 0.6126279863481229,
"acc_stderr": 0.01423587248790987,
"acc_norm": 0.6373720136518771,
"acc_norm_stderr": 0.014049106564955012
},
"harness|hellaswag|10": {
"acc": 0.6693885680143398,
"acc_stderr": 0.004694718918225753,
"acc_norm": 0.8578968333001394,
"acc_norm_stderr": 0.003484423442092664
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395269,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395269
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880263,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880263
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.032579014820998356,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.032579014820998356
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7451612903225806,
"acc_stderr": 0.024790118459332208,
"acc_norm": 0.7451612903225806,
"acc_norm_stderr": 0.024790118459332208
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511657,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511657
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198892,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198892
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758733,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5897435897435898,
"acc_stderr": 0.024939313906940798,
"acc_norm": 0.5897435897435898,
"acc_norm_stderr": 0.024939313906940798
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6050420168067226,
"acc_stderr": 0.03175367846096626,
"acc_norm": 0.6050420168067226,
"acc_norm_stderr": 0.03175367846096626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658753,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658753
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8110091743119267,
"acc_stderr": 0.01678548115920363,
"acc_norm": 0.8110091743119267,
"acc_norm_stderr": 0.01678548115920363
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145628,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145628
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094634,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094634
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.04742762361243011,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.04742762361243011
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7841634738186463,
"acc_stderr": 0.014711684386139953,
"acc_norm": 0.7841634738186463,
"acc_norm_stderr": 0.014711684386139953
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3743016759776536,
"acc_stderr": 0.01618544417945717,
"acc_norm": 0.3743016759776536,
"acc_norm_stderr": 0.01618544417945717
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.026336613469046633,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.026336613469046633
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.02517104191530968,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.02517104191530968
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4485006518904824,
"acc_stderr": 0.012702317490559806,
"acc_norm": 0.4485006518904824,
"acc_norm_stderr": 0.012702317490559806
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6433823529411765,
"acc_stderr": 0.029097209568411952,
"acc_norm": 0.6433823529411765,
"acc_norm_stderr": 0.029097209568411952
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.019047485239360378,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.019047485239360378
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.028920583220675606,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.028920583220675606
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.027962677604768917,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.027962677604768917
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.031581495393387324,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.031581495393387324
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4149326805385557,
"mc1_stderr": 0.017248314465805978,
"mc2": 0.5660736416141117,
"mc2_stderr": 0.015703591472463297
},
"harness|winogrande|5": {
"acc": 0.7821625887924231,
"acc_stderr": 0.011601066079939324
},
"harness|gsm8k|5": {
"acc": 0.4086429112964367,
"acc_stderr": 0.013540639733342422
}
}
```
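As an illustration of how these per-task entries are typically aggregated, the 5-shot MMLU average is the mean of the `acc` values of the `hendrycksTest-*` entries. A minimal sketch over a hand-copied subset of the JSON above (only three of the 57 tasks are reproduced here, so the result is illustrative, not the model's actual MMLU score):

```python
# Average the "acc" of MMLU ("hendrycksTest") entries from a results dict
# shaped like the JSON above. Only three entries are reproduced, so this
# demonstrates the aggregation, not the real leaderboard number.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.29},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5925925925925926},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6644736842105263},
}

mmlu_accs = [
    v["acc"]
    for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(round(mmlu_avg, 4))
# -> 0.5157
```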
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
open-llm-leaderboard/details_lemon-mint__gemma-ko-7b-instruct-v0.71
---
pretty_name: Evaluation run of lemon-mint/gemma-ko-7b-instruct-v0.71
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lemon-mint/gemma-ko-7b-instruct-v0.71](https://huggingface.co/lemon-mint/gemma-ko-7b-instruct-v0.71)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lemon-mint__gemma-ko-7b-instruct-v0.71\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-09T06:37:34.656936](https://huggingface.co/datasets/open-llm-leaderboard/details_lemon-mint__gemma-ko-7b-instruct-v0.71/blob/main/results_2024-04-09T06-37-34.656936.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5942899564374612,\n\
\ \"acc_stderr\": 0.03331686070230768,\n \"acc_norm\": 0.5993757855966777,\n\
\ \"acc_norm_stderr\": 0.03398353543840741,\n \"mc1\": 0.33414932680538556,\n\
\ \"mc1_stderr\": 0.016512530677150538,\n \"mc2\": 0.5172148516529776,\n\
\ \"mc2_stderr\": 0.015636005438812176\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5093856655290102,\n \"acc_stderr\": 0.014608816322065,\n\
\ \"acc_norm\": 0.5503412969283277,\n \"acc_norm_stderr\": 0.01453714444428473\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5894244174467238,\n\
\ \"acc_stderr\": 0.004909328992915072,\n \"acc_norm\": 0.7746464847639912,\n\
\ \"acc_norm_stderr\": 0.004169610254807967\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.029146904747798335,\n\
\ \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.029146904747798335\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.037161774375660164,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.037161774375660164\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145634,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145634\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5317919075144508,\n\
\ \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.5317919075144508,\n\
\ \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.046774730044911984,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.046774730044911984\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43386243386243384,\n \"acc_stderr\": 0.025525034382474894,\n \"\
acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.025525034382474894\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7161290322580646,\n\
\ \"acc_stderr\": 0.025649381063029268,\n \"acc_norm\": 0.7161290322580646,\n\
\ \"acc_norm_stderr\": 0.025649381063029268\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624336,\n\
\ \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624336\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494562,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494562\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593552,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593552\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5923076923076923,\n \"acc_stderr\": 0.024915243985987847,\n\
\ \"acc_norm\": 0.5923076923076923,\n \"acc_norm_stderr\": 0.024915243985987847\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.29259259259259257,\n \"acc_stderr\": 0.027738969632176088,\n \
\ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.027738969632176088\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.03077805742293167,\n \
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.03077805742293167\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8036697247706422,\n \"acc_stderr\": 0.01703071933915435,\n \"\
acc_norm\": 0.8036697247706422,\n \"acc_norm_stderr\": 0.01703071933915435\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7107843137254902,\n \"acc_stderr\": 0.031822318676475524,\n \"\
acc_norm\": 0.7107843137254902,\n \"acc_norm_stderr\": 0.031822318676475524\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n\
\ \"acc_stderr\": 0.03181149747055359,\n \"acc_norm\": 0.6591928251121076,\n\
\ \"acc_norm_stderr\": 0.03181149747055359\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.04118438565806298,\n\
\ \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.04118438565806298\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709697,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709697\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6574074074074074,\n\
\ \"acc_stderr\": 0.045879047413018105,\n \"acc_norm\": 0.6574074074074074,\n\
\ \"acc_norm_stderr\": 0.045879047413018105\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n\
\ \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.0398913985953177,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.0398913985953177\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n\
\ \"acc_stderr\": 0.023902325549560403,\n \"acc_norm\": 0.8418803418803419,\n\
\ \"acc_norm_stderr\": 0.023902325549560403\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7713920817369093,\n\
\ \"acc_stderr\": 0.015016884698539892,\n \"acc_norm\": 0.7713920817369093,\n\
\ \"acc_norm_stderr\": 0.015016884698539892\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.025906632631016127,\n\
\ \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.025906632631016127\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2536312849162011,\n\
\ \"acc_stderr\": 0.014551553659369918,\n \"acc_norm\": 0.2536312849162011,\n\
\ \"acc_norm_stderr\": 0.014551553659369918\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6209150326797386,\n \"acc_stderr\": 0.02778014120702335,\n\
\ \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.02778014120702335\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6302250803858521,\n\
\ \"acc_stderr\": 0.027417996705630998,\n \"acc_norm\": 0.6302250803858521,\n\
\ \"acc_norm_stderr\": 0.027417996705630998\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.026229649178821157,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.026229649178821157\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4219858156028369,\n \"acc_stderr\": 0.029462189233370593,\n \
\ \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.029462189233370593\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4556714471968709,\n\
\ \"acc_stderr\": 0.012719949543032205,\n \"acc_norm\": 0.4556714471968709,\n\
\ \"acc_norm_stderr\": 0.012719949543032205\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5551470588235294,\n \"acc_stderr\": 0.030187532060329383,\n\
\ \"acc_norm\": 0.5551470588235294,\n \"acc_norm_stderr\": 0.030187532060329383\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5947712418300654,\n \"acc_stderr\": 0.019861155193829156,\n \
\ \"acc_norm\": 0.5947712418300654,\n \"acc_norm_stderr\": 0.019861155193829156\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910507,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910507\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065684,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065684\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7761194029850746,\n\
\ \"acc_stderr\": 0.029475250236017193,\n \"acc_norm\": 0.7761194029850746,\n\
\ \"acc_norm_stderr\": 0.029475250236017193\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.33414932680538556,\n\
\ \"mc1_stderr\": 0.016512530677150538,\n \"mc2\": 0.5172148516529776,\n\
\ \"mc2_stderr\": 0.015636005438812176\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6977111286503551,\n \"acc_stderr\": 0.012907200361627532\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.41698256254738436,\n \
\ \"acc_stderr\": 0.013581320997216591\n }\n}\n```"
repo_url: https://huggingface.co/lemon-mint/gemma-ko-7b-instruct-v0.71
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|arc:challenge|25_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|gsm8k|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hellaswag|10_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|winogrande|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-09T06-37-34.656936.parquet'
- config_name: results
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- results_2024-04-09T06-37-34.656936.parquet
- split: latest
path:
- results_2024-04-09T06-37-34.656936.parquet
---
# Dataset Card for Evaluation run of lemon-mint/gemma-ko-7b-instruct-v0.71
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [lemon-mint/gemma-ko-7b-instruct-v0.71](https://huggingface.co/lemon-mint/gemma-ko-7b-instruct-v0.71) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lemon-mint__gemma-ko-7b-instruct-v0.71",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-04-09T06:37:34.656936](https://huggingface.co/datasets/open-llm-leaderboard/details_lemon-mint__gemma-ko-7b-instruct-v0.71/blob/main/results_2024-04-09T06-37-34.656936.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5942899564374612,
"acc_stderr": 0.03331686070230768,
"acc_norm": 0.5993757855966777,
"acc_norm_stderr": 0.03398353543840741,
"mc1": 0.33414932680538556,
"mc1_stderr": 0.016512530677150538,
"mc2": 0.5172148516529776,
"mc2_stderr": 0.015636005438812176
},
"harness|arc:challenge|25": {
"acc": 0.5093856655290102,
"acc_stderr": 0.014608816322065,
"acc_norm": 0.5503412969283277,
"acc_norm_stderr": 0.01453714444428473
},
"harness|hellaswag|10": {
"acc": 0.5894244174467238,
"acc_stderr": 0.004909328992915072,
"acc_norm": 0.7746464847639912,
"acc_norm_stderr": 0.004169610254807967
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.660377358490566,
"acc_stderr": 0.029146904747798335,
"acc_norm": 0.660377358490566,
"acc_norm_stderr": 0.029146904747798335
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.037161774375660164,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.037161774375660164
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145634,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145634
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.03804749744364764,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.03804749744364764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.046774730044911984,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.046774730044911984
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43386243386243384,
"acc_stderr": 0.025525034382474894,
"acc_norm": 0.43386243386243384,
"acc_norm_stderr": 0.025525034382474894
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7161290322580646,
"acc_stderr": 0.025649381063029268,
"acc_norm": 0.7161290322580646,
"acc_norm_stderr": 0.025649381063029268
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.03546563019624336,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.03546563019624336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494562,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494562
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593552,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593552
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5923076923076923,
"acc_stderr": 0.024915243985987847,
"acc_norm": 0.5923076923076923,
"acc_norm_stderr": 0.024915243985987847
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.027738969632176088,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.027738969632176088
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.03077805742293167,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.03077805742293167
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8036697247706422,
"acc_stderr": 0.01703071933915435,
"acc_norm": 0.8036697247706422,
"acc_norm_stderr": 0.01703071933915435
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7107843137254902,
"acc_stderr": 0.031822318676475524,
"acc_norm": 0.7107843137254902,
"acc_norm_stderr": 0.031822318676475524
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.03181149747055359,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.03181149747055359
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6717557251908397,
"acc_stderr": 0.04118438565806298,
"acc_norm": 0.6717557251908397,
"acc_norm_stderr": 0.04118438565806298
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709697,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709697
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.045879047413018105,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.045879047413018105
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6809815950920245,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.6809815950920245,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.0398913985953177,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.0398913985953177
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.023902325549560403,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.023902325549560403
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7713920817369093,
"acc_stderr": 0.015016884698539892,
"acc_norm": 0.7713920817369093,
"acc_norm_stderr": 0.015016884698539892
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.025906632631016127,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.025906632631016127
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2536312849162011,
"acc_stderr": 0.014551553659369918,
"acc_norm": 0.2536312849162011,
"acc_norm_stderr": 0.014551553659369918
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6209150326797386,
"acc_stderr": 0.02778014120702335,
"acc_norm": 0.6209150326797386,
"acc_norm_stderr": 0.02778014120702335
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6302250803858521,
"acc_stderr": 0.027417996705630998,
"acc_norm": 0.6302250803858521,
"acc_norm_stderr": 0.027417996705630998
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.026229649178821157,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.026229649178821157
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4219858156028369,
"acc_stderr": 0.029462189233370593,
"acc_norm": 0.4219858156028369,
"acc_norm_stderr": 0.029462189233370593
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4556714471968709,
"acc_stderr": 0.012719949543032205,
"acc_norm": 0.4556714471968709,
"acc_norm_stderr": 0.012719949543032205
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5551470588235294,
"acc_stderr": 0.030187532060329383,
"acc_norm": 0.5551470588235294,
"acc_norm_stderr": 0.030187532060329383
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5947712418300654,
"acc_stderr": 0.019861155193829156,
"acc_norm": 0.5947712418300654,
"acc_norm_stderr": 0.019861155193829156
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910507,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910507
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065684,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065684
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7761194029850746,
"acc_stderr": 0.029475250236017193,
"acc_norm": 0.7761194029850746,
"acc_norm_stderr": 0.029475250236017193
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.33414932680538556,
"mc1_stderr": 0.016512530677150538,
"mc2": 0.5172148516529776,
"mc2_stderr": 0.015636005438812176
},
"harness|winogrande|5": {
"acc": 0.6977111286503551,
"acc_stderr": 0.012907200361627532
},
"harness|gsm8k|5": {
"acc": 0.41698256254738436,
"acc_stderr": 0.013581320997216591
}
}
```
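As an illustration (not part of the evaluation pipeline), the macro-averaged accuracy reported under `"all"` can be reproduced from the per-task entries by averaging their `acc` values. A minimal sketch, using a small invented excerpt of the results above:

```python
# Minimal sketch: macro-average the per-task "acc" values,
# mirroring how the "all" entry aggregates task-level scores.
# (Excerpt of three tasks only; the real file has 60+ entries.)
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.3},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5185185185185185},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.625},
}

# Macro average: every task weighs the same, regardless of its size.
macro_acc = sum(task["acc"] for task in results.values()) / len(results)
print(round(macro_acc, 4))  # → 0.4812
```

The same pattern applies to `acc_norm`, `mc1`, and `mc2`, each averaged over the tasks that report them.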
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
heliosprime/twitter_dataset_1713183593 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 11375
num_examples: 33
download_size: 12621
dataset_size: 11375
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713183593"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
daqc/constitucion_politica_del_peru_1993_qa_raw | ---
dataset_info:
features:
- name: context
dtype: string
- name: input
dtype: string
- name: instruction-rating
sequence: 'null'
- name: instruction-rating-suggestion
dtype: float64
- name: instruction-rating-suggestion-metadata
struct:
- name: agent
dtype: 'null'
- name: score
dtype: 'null'
- name: type
dtype: 'null'
- name: curated-instruction
sequence: 'null'
- name: curated-instruction-suggestion
dtype: 'null'
- name: curated-instruction-suggestion-metadata
struct:
- name: agent
dtype: 'null'
- name: score
dtype: 'null'
- name: type
dtype: 'null'
- name: external_id
dtype: 'null'
- name: metadata
dtype: string
- name: vectors
struct:
- name: input
dtype: 'null'
- name: instructions
dtype: 'null'
- name: generation_model
sequence: string
- name: generation_prompt
sequence: string
- name: raw_generation_responses
sequence: string
- name: generations
sequence: string
splits:
- name: train
num_bytes: 9733509
num_examples: 2075
download_size: 2003275
dataset_size: 9733509
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
o2satz/MedText_modified | ---
dataset_info:
features:
- name: Prompt
dtype: string
- name: Completion
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1912392
num_examples: 1412
download_size: 991577
dataset_size: 1912392
---
# Dataset Card for "MedText_modified"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ghadaabbes/brexit | ---
license: postgresql
--- |
DBQ/Gucci.Product.prices.South.Korea | ---
annotations_creators:
- other
language_creators:
- other
language:
- en
license:
- unknown
multilinguality:
- monolingual
source_datasets:
- original
task_categories:
- text-classification
- image-classification
- feature-extraction
- image-segmentation
- image-to-image
- image-to-text
- object-detection
- summarization
- zero-shot-image-classification
pretty_name: South Korea - Gucci - Product-level price list
tags:
- webscraping
- ecommerce
- Gucci
- fashion
- fashion product
- image
- fashion image
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: website_name
dtype: string
- name: competence_date
dtype: string
- name: country_code
dtype: string
- name: currency_code
dtype: string
- name: brand
dtype: string
- name: category1_code
dtype: string
- name: category2_code
dtype: string
- name: category3_code
dtype: string
- name: product_code
dtype: string
- name: title
dtype: string
- name: itemurl
dtype: string
- name: imageurl
dtype: string
- name: full_price
dtype: float64
- name: price
dtype: float64
- name: full_price_eur
dtype: float64
- name: price_eur
dtype: float64
- name: flg_discount
dtype: int64
splits:
- name: train
num_bytes: 1709874
num_examples: 3796
download_size: 490827
dataset_size: 1709874
---
# Gucci web scraped data
## About the website
In the flourishing **fashion and luxury goods industry** of the Asia Pacific region, particularly **South Korea**, high-end brands like **Gucci** are experiencing a surge in popularity. E-commerce is a major driving force, and this upward trend is reflected in the gathered dataset, which includes **Ecommerce product-list page (PLP)** data on Gucci's presence in South Korea. The country’s rapid rise in individual wealth and the strong influence of pop culture on fashion trends have driven an increase in consumer demand, illustrating the growing appreciation for luxury fashion brands in the Asia Pacific region.
## Link to **dataset**
[South Korea - Gucci - Product-level price list dataset](https://www.databoutique.com/buy-data-page/Gucci%20Product-prices%20South%20Korea/r/recvqhieM1soAYRdL)
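As a minimal sketch of working with rows that follow the schema above (the `discount_summary` helper and the sample rows are hypothetical illustrations, but the field names `full_price_eur`, `price_eur`, and `flg_discount` come from the dataset features listed in the metadata), one could summarize discounting like this:

```python
# Hypothetical helper: summarize discounts over rows shaped like the
# dataset's features (full_price_eur, price_eur, flg_discount).
# The sample rows below are illustrative, not real dataset values.

def discount_summary(rows):
    """Return (n_discounted, average_discount_pct) over rows with flg_discount == 1."""
    discounted = [r for r in rows if r["flg_discount"] == 1]
    if not discounted:
        return 0, 0.0
    pct = [
        (r["full_price_eur"] - r["price_eur"]) / r["full_price_eur"] * 100
        for r in discounted
    ]
    return len(discounted), sum(pct) / len(pct)

sample = [
    {"full_price_eur": 1000.0, "price_eur": 800.0, "flg_discount": 1},
    {"full_price_eur": 500.0, "price_eur": 500.0, "flg_discount": 0},
]
n, avg = discount_summary(sample)
print(n, avg)  # 1 20.0
```

The same computation would apply unchanged to rows loaded from the actual dataset, since each row is a flat mapping from the feature names above to scalar values.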
|
cakiki/rust_paths | ---
dataset_info:
features:
- name: repository_name
dtype: string
splits:
- name: train
num_bytes: 71297350
num_examples: 3087525
download_size: 49706578
dataset_size: 71297350
---
# Dataset Card for "rust_paths"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_liminerity__dhbacmes-3b-slerp | ---
pretty_name: Evaluation run of liminerity/dhbacmes-3b-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [liminerity/dhbacmes-3b-slerp](https://huggingface.co/liminerity/dhbacmes-3b-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of\
  \ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_liminerity__dhbacmes-3b-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-29T20:03:05.684922](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__dhbacmes-3b-slerp/blob/main/results_2024-02-29T20-03-05.684922.json)(note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5276648412815843,\n\
\ \"acc_stderr\": 0.03438428439388686,\n \"acc_norm\": 0.5310695726530207,\n\
\ \"acc_norm_stderr\": 0.03508696025146873,\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766368,\n \"mc2\": 0.40412360273636533,\n\
\ \"mc2_stderr\": 0.014383564900315697\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4061433447098976,\n \"acc_stderr\": 0.014351656690097862,\n\
\ \"acc_norm\": 0.4522184300341297,\n \"acc_norm_stderr\": 0.014544519880633832\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5204142601075483,\n\
\ \"acc_stderr\": 0.004985620773683433,\n \"acc_norm\": 0.7077275443138817,\n\
\ \"acc_norm_stderr\": 0.004538773493746559\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\
\ \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n\
\ \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.040179012759817494,\n\
\ \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.040179012759817494\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5622641509433962,\n \"acc_stderr\": 0.030533338430467516,\n\
\ \"acc_norm\": 0.5622641509433962,\n \"acc_norm_stderr\": 0.030533338430467516\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6041666666666666,\n\
\ \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.6041666666666666,\n\
\ \"acc_norm_stderr\": 0.04089465449325582\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5086705202312138,\n\
\ \"acc_stderr\": 0.0381189098894041,\n \"acc_norm\": 0.5086705202312138,\n\
\ \"acc_norm_stderr\": 0.0381189098894041\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.40425531914893614,\n \"acc_stderr\": 0.03208115750788684,\n\
\ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.03208115750788684\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.36243386243386244,\n \"acc_stderr\": 0.024757473902752052,\n \"\
acc_norm\": 0.36243386243386244,\n \"acc_norm_stderr\": 0.024757473902752052\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n\
\ \"acc_stderr\": 0.041049472699033945,\n \"acc_norm\": 0.30158730158730157,\n\
\ \"acc_norm_stderr\": 0.041049472699033945\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6451612903225806,\n\
\ \"acc_stderr\": 0.02721888977330876,\n \"acc_norm\": 0.6451612903225806,\n\
\ \"acc_norm_stderr\": 0.02721888977330876\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.03499113137676744,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.03499113137676744\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5878787878787879,\n \"acc_stderr\": 0.03843566993588717,\n\
\ \"acc_norm\": 0.5878787878787879,\n \"acc_norm_stderr\": 0.03843566993588717\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6464646464646465,\n \"acc_stderr\": 0.03406086723547155,\n \"\
acc_norm\": 0.6464646464646465,\n \"acc_norm_stderr\": 0.03406086723547155\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7202072538860104,\n \"acc_stderr\": 0.03239637046735704,\n\
\ \"acc_norm\": 0.7202072538860104,\n \"acc_norm_stderr\": 0.03239637046735704\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5,\n \"acc_stderr\": 0.02535100632816969,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.02535100632816969\n },\n \"harness|hendrycksTest-high_school_mathematics|5\"\
: {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n\
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5672268907563025,\n \"acc_stderr\": 0.032183581077426124,\n\
\ \"acc_norm\": 0.5672268907563025,\n \"acc_norm_stderr\": 0.032183581077426124\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6788990825688074,\n \"acc_stderr\": 0.020018149772733744,\n \"\
acc_norm\": 0.6788990825688074,\n \"acc_norm_stderr\": 0.020018149772733744\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.37037037037037035,\n \"acc_stderr\": 0.03293377139415191,\n \"\
acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.03293377139415191\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6225490196078431,\n \"acc_stderr\": 0.03402272044340703,\n \"\
acc_norm\": 0.6225490196078431,\n \"acc_norm_stderr\": 0.03402272044340703\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6582278481012658,\n \"acc_stderr\": 0.03087453753755362,\n \
\ \"acc_norm\": 0.6582278481012658,\n \"acc_norm_stderr\": 0.03087453753755362\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6053811659192825,\n\
\ \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.6053811659192825,\n\
\ \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009224,\n\
\ \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009224\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7107438016528925,\n \"acc_stderr\": 0.04139112727635463,\n \"\
acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.04139112727635463\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5833333333333334,\n\
\ \"acc_stderr\": 0.04766075165356461,\n \"acc_norm\": 0.5833333333333334,\n\
\ \"acc_norm_stderr\": 0.04766075165356461\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6257668711656442,\n \"acc_stderr\": 0.03802068102899615,\n\
\ \"acc_norm\": 0.6257668711656442,\n \"acc_norm_stderr\": 0.03802068102899615\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6504854368932039,\n \"acc_stderr\": 0.047211885060971716,\n\
\ \"acc_norm\": 0.6504854368932039,\n \"acc_norm_stderr\": 0.047211885060971716\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n\
\ \"acc_stderr\": 0.025598193686652268,\n \"acc_norm\": 0.811965811965812,\n\
\ \"acc_norm_stderr\": 0.025598193686652268\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.0498887651569859,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.0498887651569859\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6704980842911877,\n\
\ \"acc_stderr\": 0.016808322261740463,\n \"acc_norm\": 0.6704980842911877,\n\
\ \"acc_norm_stderr\": 0.016808322261740463\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6011560693641619,\n \"acc_stderr\": 0.026362437574546538,\n\
\ \"acc_norm\": 0.6011560693641619,\n \"acc_norm_stderr\": 0.026362437574546538\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23687150837988827,\n\
\ \"acc_stderr\": 0.01421957078810399,\n \"acc_norm\": 0.23687150837988827,\n\
\ \"acc_norm_stderr\": 0.01421957078810399\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.027914055510468008,\n\
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.027914055510468008\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5980707395498392,\n\
\ \"acc_stderr\": 0.02784647600593047,\n \"acc_norm\": 0.5980707395498392,\n\
\ \"acc_norm_stderr\": 0.02784647600593047\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5771604938271605,\n \"acc_stderr\": 0.027487472980871588,\n\
\ \"acc_norm\": 0.5771604938271605,\n \"acc_norm_stderr\": 0.027487472980871588\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3723404255319149,\n \"acc_stderr\": 0.028838921471251455,\n \
\ \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.028838921471251455\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.40091264667535853,\n\
\ \"acc_stderr\": 0.012516960350640814,\n \"acc_norm\": 0.40091264667535853,\n\
\ \"acc_norm_stderr\": 0.012516960350640814\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.030187532060329383,\n\
\ \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.030187532060329383\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.49836601307189543,\n \"acc_stderr\": 0.020227726838150127,\n \
\ \"acc_norm\": 0.49836601307189543,\n \"acc_norm_stderr\": 0.020227726838150127\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661896,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661896\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6244897959183674,\n \"acc_stderr\": 0.03100120903989484,\n\
\ \"acc_norm\": 0.6244897959183674,\n \"acc_norm_stderr\": 0.03100120903989484\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7562189054726368,\n\
\ \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.7562189054726368,\n\
\ \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.038879718495972646,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.038879718495972646\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7251461988304093,\n \"acc_stderr\": 0.034240429246915824,\n\
\ \"acc_norm\": 0.7251461988304093,\n \"acc_norm_stderr\": 0.034240429246915824\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766368,\n \"mc2\": 0.40412360273636533,\n\
\ \"mc2_stderr\": 0.014383564900315697\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6511444356748224,\n \"acc_stderr\": 0.013395059320137334\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.43669446550416985,\n \
\ \"acc_stderr\": 0.013661649780905493\n }\n}\n```"
repo_url: https://huggingface.co/liminerity/dhbacmes-3b-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|arc:challenge|25_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|gsm8k|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hellaswag|10_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T20-03-05.684922.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T20-03-05.684922.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_29T20_03_05.684922
path:
- '**/details_harness|winogrande|5_2024-02-29T20-03-05.684922.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-29T20-03-05.684922.parquet'
- config_name: results
data_files:
- split: 2024_02_29T13_07_58.432043
path:
- results_2024-02-29T13-07-58.432043.parquet
- split: 2024_02_29T20_03_05.684922
path:
- results_2024-02-29T20-03-05.684922.parquet
- split: latest
path:
- results_2024-02-29T20-03-05.684922.parquet
---
# Dataset Card for Evaluation run of liminerity/dhbacmes-3b-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [liminerity/dhbacmes-3b-slerp](https://huggingface.co/liminerity/dhbacmes-3b-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_liminerity__dhbacmes-3b-slerp",
"harness_winogrande_5",
split="train")
```
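Each timestamped split name can be mapped back to a `datetime` if you need to sort or compare runs. A minimal sketch (the split name below is taken from this card's own metadata; the format simply replaces `-` and `:` with `_`):

```python
from datetime import datetime

def split_to_datetime(split_name: str) -> datetime:
    # Split names encode the run timestamp with '-' and ':' replaced by '_',
    # e.g. "2024_02_29T20_03_05.684922" -> 2024-02-29 20:03:05.684922
    return datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")

ts = split_to_datetime("2024_02_29T20_03_05.684922")
```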
## Latest results
These are the [latest results from run 2024-02-29T20:03:05.684922](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__dhbacmes-3b-slerp/blob/main/results_2024-02-29T20-03-05.684922.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5276648412815843,
"acc_stderr": 0.03438428439388686,
"acc_norm": 0.5310695726530207,
"acc_norm_stderr": 0.03508696025146873,
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766368,
"mc2": 0.40412360273636533,
"mc2_stderr": 0.014383564900315697
},
"harness|arc:challenge|25": {
"acc": 0.4061433447098976,
"acc_stderr": 0.014351656690097862,
"acc_norm": 0.4522184300341297,
"acc_norm_stderr": 0.014544519880633832
},
"harness|hellaswag|10": {
"acc": 0.5204142601075483,
"acc_stderr": 0.004985620773683433,
"acc_norm": 0.7077275443138817,
"acc_norm_stderr": 0.004538773493746559
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.04284958639753399,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.04284958639753399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.040179012759817494,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.040179012759817494
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5622641509433962,
"acc_stderr": 0.030533338430467516,
"acc_norm": 0.5622641509433962,
"acc_norm_stderr": 0.030533338430467516
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6041666666666666,
"acc_stderr": 0.04089465449325582,
"acc_norm": 0.6041666666666666,
"acc_norm_stderr": 0.04089465449325582
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5086705202312138,
"acc_stderr": 0.0381189098894041,
"acc_norm": 0.5086705202312138,
"acc_norm_stderr": 0.0381189098894041
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062946,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336936,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36243386243386244,
"acc_stderr": 0.024757473902752052,
"acc_norm": 0.36243386243386244,
"acc_norm_stderr": 0.024757473902752052
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.041049472699033945,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.041049472699033945
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6451612903225806,
"acc_stderr": 0.02721888977330876,
"acc_norm": 0.6451612903225806,
"acc_norm_stderr": 0.02721888977330876
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.03499113137676744,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.03499113137676744
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5878787878787879,
"acc_stderr": 0.03843566993588717,
"acc_norm": 0.5878787878787879,
"acc_norm_stderr": 0.03843566993588717
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6464646464646465,
"acc_stderr": 0.03406086723547155,
"acc_norm": 0.6464646464646465,
"acc_norm_stderr": 0.03406086723547155
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7202072538860104,
"acc_stderr": 0.03239637046735704,
"acc_norm": 0.7202072538860104,
"acc_norm_stderr": 0.03239637046735704
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5,
"acc_stderr": 0.02535100632816969,
"acc_norm": 0.5,
"acc_norm_stderr": 0.02535100632816969
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5672268907563025,
"acc_stderr": 0.032183581077426124,
"acc_norm": 0.5672268907563025,
"acc_norm_stderr": 0.032183581077426124
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6788990825688074,
"acc_stderr": 0.020018149772733744,
"acc_norm": 0.6788990825688074,
"acc_norm_stderr": 0.020018149772733744
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.03293377139415191,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.03293377139415191
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6225490196078431,
"acc_stderr": 0.03402272044340703,
"acc_norm": 0.6225490196078431,
"acc_norm_stderr": 0.03402272044340703
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6582278481012658,
"acc_stderr": 0.03087453753755362,
"acc_norm": 0.6582278481012658,
"acc_norm_stderr": 0.03087453753755362
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6053811659192825,
"acc_stderr": 0.03280400504755291,
"acc_norm": 0.6053811659192825,
"acc_norm_stderr": 0.03280400504755291
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6030534351145038,
"acc_stderr": 0.04291135671009224,
"acc_norm": 0.6030534351145038,
"acc_norm_stderr": 0.04291135671009224
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.04139112727635463,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.04139112727635463
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.04766075165356461,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.04766075165356461
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6257668711656442,
"acc_stderr": 0.03802068102899615,
"acc_norm": 0.6257668711656442,
"acc_norm_stderr": 0.03802068102899615
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.6504854368932039,
"acc_stderr": 0.047211885060971716,
"acc_norm": 0.6504854368932039,
"acc_norm_stderr": 0.047211885060971716
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.811965811965812,
"acc_stderr": 0.025598193686652268,
"acc_norm": 0.811965811965812,
"acc_norm_stderr": 0.025598193686652268
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.0498887651569859,
"acc_norm": 0.56,
"acc_norm_stderr": 0.0498887651569859
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6704980842911877,
"acc_stderr": 0.016808322261740463,
"acc_norm": 0.6704980842911877,
"acc_norm_stderr": 0.016808322261740463
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.026362437574546538,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.026362437574546538
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23687150837988827,
"acc_stderr": 0.01421957078810399,
"acc_norm": 0.23687150837988827,
"acc_norm_stderr": 0.01421957078810399
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.027914055510468008,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.027914055510468008
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5980707395498392,
"acc_stderr": 0.02784647600593047,
"acc_norm": 0.5980707395498392,
"acc_norm_stderr": 0.02784647600593047
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5771604938271605,
"acc_stderr": 0.027487472980871588,
"acc_norm": 0.5771604938271605,
"acc_norm_stderr": 0.027487472980871588
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3723404255319149,
"acc_stderr": 0.028838921471251455,
"acc_norm": 0.3723404255319149,
"acc_norm_stderr": 0.028838921471251455
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.40091264667535853,
"acc_stderr": 0.012516960350640814,
"acc_norm": 0.40091264667535853,
"acc_norm_stderr": 0.012516960350640814
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.44485294117647056,
"acc_stderr": 0.030187532060329383,
"acc_norm": 0.44485294117647056,
"acc_norm_stderr": 0.030187532060329383
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.49836601307189543,
"acc_stderr": 0.020227726838150127,
"acc_norm": 0.49836601307189543,
"acc_norm_stderr": 0.020227726838150127
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661896,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661896
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6244897959183674,
"acc_stderr": 0.03100120903989484,
"acc_norm": 0.6244897959183674,
"acc_norm_stderr": 0.03100120903989484
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7562189054726368,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.7562189054726368,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.038879718495972646,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.038879718495972646
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7251461988304093,
"acc_stderr": 0.034240429246915824,
"acc_norm": 0.7251461988304093,
"acc_norm_stderr": 0.034240429246915824
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766368,
"mc2": 0.40412360273636533,
"mc2_stderr": 0.014383564900315697
},
"harness|winogrande|5": {
"acc": 0.6511444356748224,
"acc_stderr": 0.013395059320137334
},
"harness|gsm8k|5": {
"acc": 0.43669446550416985,
"acc_stderr": 0.013661649780905493
}
}
```
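The per-task dictionaries above can be aggregated in plain Python. A minimal sketch that macro-averages `acc` over the MMLU (`hendrycksTest`) subtasks, using a small excerpt of the values shown above (the full results contain all 57 subtasks):

```python
# Excerpt of the results shown above (three MMLU subtasks plus one non-MMLU task).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.37},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.43703703703703706},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.5789473684210527},
    "harness|truthfulqa:mc|0": {"mc1": 0.2607099143206854},  # not an MMLU task
}

# Keep only MMLU subtasks, identified by the "hendrycksTest" prefix.
mmlu = [v["acc"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")]
macro_acc = sum(mmlu) / len(mmlu)
```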
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
MoonIcee/gabrieljunio1 | ---
license: openrail
---
|
sidovic/LearningQ-qg | ---
license: unknown
task_categories:
- text-generation
language:
- en
tags:
- question generation
pretty_name: LearningQ-qg
size_categories:
- 100K<n<1M
train-eval-index:
- config: plain_text
task: question-generation
task_id: extractive_question_generation
splits:
train_split: train
eval_split: validation
test_split: test
col_mapping:
context: context
questionsrc: question source
question: question
metrics:
- type: squad
name: SQuAD
dataset_info:
features:
- name: context
dtype: string
- name: questionsrc
dtype: string
- name: question
dtype: string
config_name: plain_text
splits:
- name: train
num_examples: 188660
- name: validation
num_examples: 20630
- name: test
num_examples: 18227
---
# Dataset Card for LearningQ-qg
## Dataset Description
- **Repository:** [GitHub](https://github.com/AngusGLChen/LearningQ#readme)
- **Paper:** [LearningQ: A Large-scale Dataset for Educational Question Generation](https://ojs.aaai.org/index.php/ICWSM/article/view/14987/14837)
- **Point of Contact:** s.lamri@univ-bouira.dz
### Dataset Summary
LearningQ is a challenging educational question generation dataset containing over 230K document-question pairs, created by Guanliang Chen, Jie Yang, Claudia Hauff and Geert-Jan Houben. It includes 7K instructor-designed questions assessing the knowledge concepts being taught and 223K learner-generated questions seeking in-depth understanding of those concepts. This new version was collected and corrected by [Sidali Lamri](https://dz.linkedin.com/in/sidali-lamri), fixing more than 50,000 errors across more than 1,500 error types.
### Use the dataset
```python
from datasets import load_dataset
lq_dataset = load_dataset("sidovic/LearningQ-qg")
lq_dataset["train"][1]
len(lq_dataset["train"]),len(lq_dataset["validation"]),len(lq_dataset["test"])
```
### Supported Tasks and Leaderboards
[Question generation]
### Languages
[English]
## Dataset Structure
### Data Instances
An example looks as follows.
```
{
"context": "This is a test context.",
"questionsrc": "test context",
"question": "Is this a test?"
}
```
### Data Fields
The data fields are the same among all splits.
- `context`: a `string` feature.
- `questionsrc`: a `string` feature.
- `question`: a `string` feature.
### Data Splits
| name |train |validation|test |
|----------|-----:|---------:|----:|
|LearningQ |188660| 20630|18227|
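As a sanity check, the split sizes in the table can be turned into proportions with a few lines of plain Python (numbers copied from the table above; no download needed):

```python
# Split sizes as published in the Data Splits table
splits = {"train": 188660, "validation": 20630, "test": 18227}
total = sum(splits.values())          # 227,517 examples overall
for name, n in splits.items():
    print(f"{name}: {n / total:.1%}")
```

This works out to roughly an 83/9/8 train/validation/test partition.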
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```
@misc{Lamri2023LearningQ,
  author = {Sidali Lamri},
  title = {new LearningQ version for Question generation in transformers},
  year = {2023}
}
@inproceedings{ICWSM18LearningQ,
  author = {Guanliang Chen and Jie Yang and Claudia Hauff and Geert-Jan Houben},
  title = {LearningQ: A Large-scale Dataset for Educational Question Generation},
  booktitle = {International AAAI Conference on Web and Social Media},
  year = {2018}
}
```
### Contributions
[More Information Needed] |
zsy12345/telugu-asr | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 38112606621.4
num_examples: 168076
download_size: 37317736585
dataset_size: 38112606621.4
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
marsyas/gtzan | ---
pretty_name: GTZAN
---
# Dataset Card for GTZAN
## Table of Contents
- [Dataset Card for GTZAN](#dataset-card-for-gtzan)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [http://marsyas.info/downloads/datasets.html](http://marsyas.info/downloads/datasets.html)
- **Paper:** [http://ismir2001.ismir.net/pdf/tzanetakis.pdf](http://ismir2001.ismir.net/pdf/tzanetakis.pdf)
- **Point of Contact:**
### Dataset Summary
GTZAN is a dataset for musical genre classification of audio signals. It consists of 1,000 audio tracks, each 30 seconds long, covering 10 genres with 100 tracks per genre. The tracks are all 22,050 Hz, mono, 16-bit WAV files. The genres are: blues, classical, country, disco, hiphop, jazz, metal, pop, reggae, and rock.
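A back-of-the-envelope estimate of the raw audio size follows directly from the format stated above (22,050 Hz, mono, 16-bit, 30 s per track; WAV headers ignored):

```python
# Uncompressed PCM size per track: sample_rate * bytes_per_sample * duration
sr, bytes_per_sample, seconds, tracks = 22050, 2, 30, 1000
per_track = sr * bytes_per_sample * seconds   # 1,323,000 bytes, about 1.26 MiB
total = per_track * tracks                    # about 1.23 GiB for all 1,000 tracks
print(per_track, total)
```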
### Languages
English
## Dataset Structure
GTZAN is distributed as a single dataset without a predefined training and test split. The information below refers to the single `train` split that is assigned by default.
### Data Instances
An example of GTZAN looks as follows:
```python
{
"file": "/path/to/cache/genres/blues/blues.00000.wav",
"audio": {
"path": "/path/to/cache/genres/blues/blues.00000.wav",
"array": array(
[
0.00732422,
0.01660156,
0.00762939,
...,
-0.05560303,
-0.06106567,
-0.06417847,
],
dtype=float32,
),
"sampling_rate": 22050,
},
"genre": 0,
}
```
### Data Fields
The types associated with each of the data fields is as follows:
* `file`: a `string` feature.
* `audio`: an `Audio` feature containing the `path` of the sound file, the decoded waveform in the `array` field, and the `sampling_rate`.
* `genre`: a `ClassLabel` feature.
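Since `genre` is a `ClassLabel`, integer labels map to the genre names listed in the summary. A minimal sketch of that mapping, assuming the usual alphabetical ordering (consistent with `genre: 0` for the blues instance shown above):

```python
# Genre names in alphabetical order; list index = ClassLabel integer
genres = ["blues", "classical", "country", "disco", "hiphop",
          "jazz", "metal", "pop", "reggae", "rock"]

def int2str(label: int) -> str:
    return genres[label]

def str2int(name: str) -> int:
    return genres.index(name)

print(int2str(0))   # the blues example above carries genre 0
```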
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```
@misc{tzanetakis_essl_cook_2001,
author = "Tzanetakis, George and Essl, Georg and Cook, Perry",
title = "Automatic Musical Genre Classification Of Audio Signals",
url = "http://ismir2001.ismir.net/pdf/tzanetakis.pdf",
publisher = "The International Society for Music Information Retrieval",
year = "2001"
}
```
### Contributions
Thanks to [@lewtun](https://github.com/lewtun) for adding this dataset. |
qbao775/PARARULE-Plus | ---
license: mit
task_categories:
- text-classification
- question-answering
language:
- en
tags:
- Reasoning
- Multi-Step-Deductive-Reasoning
- Logical-Reasoning
size_categories:
- 100K<n<1M
---
# PARARULE-Plus
This branch includes the data from PARARULE-Plus at Depth=2, Depth=3, Depth=4 and Depth=5. PARARULE-Plus is a deep multi-step reasoning dataset over natural language. It can be seen as an improvement on the PARARULE dataset (Peter Clark et al., 2020). Both PARARULE and PARARULE-Plus follow the closed-world assumption and negation as failure. The motivation is to generate deeper PARARULE training samples: we add more training samples for cases where the depth is greater than or equal to two, to explore whether Transformers have reasoning ability. PARARULE-Plus combines two types of entities, animals and people, with corresponding relationships and attributes. For each depth from 2 to 5 there are around 100,000 samples, giving nearly 400,000 samples in total.
Here are the original links for PARARULE-Plus, including the paper, project and data.
Paper: https://www.cs.ox.ac.uk/isg/conferences/tmp-proceedings/NeSy2022/paper15.pdf
Project: https://github.com/Strong-AI-Lab/Multi-Step-Deductive-Reasoning-Over-Natural-Language
Data: https://github.com/Strong-AI-Lab/PARARULE-Plus
PARARULE-Plus has been collected and merged by [LogiTorch.ai](https://www.logitorch.ai/), [ReasoningNLP](https://github.com/FreedomIntelligence/ReasoningNLP), [Prompt4ReasoningPapers](https://github.com/zjunlp/Prompt4ReasoningPapers) and [OpenAI/Evals](https://github.com/openai/evals/pull/651).
In this huggingface version, we pre-processed the dataset and use `1` to represent `true` and `0` to represent `false` to better help user train model.
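A minimal sketch of that label convention (the mapping below is illustrative; check the actual column names in the release):

```python
# 1 represents true, 0 represents false, per the card's preprocessing
LABEL2ID = {"true": 1, "false": 0}
ID2LABEL = {v: k for k, v in LABEL2ID.items()}
print(ID2LABEL[1])
```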
## How to load the dataset?
```
from datasets import load_dataset
dataset = load_dataset("qbao775/PARARULE-Plus")
```
## How to train a model using the dataset?
We provide an [example](https://github.com/Strong-AI-Lab/PARARULE-Plus/blob/main/README.md#an-example-script-to-load-pararule-plus-and-fine-tune-bert) that you can `git clone` the project and fine-tune the dataset locally.
## Citation
```
@inproceedings{bao2022multi,
title={Multi-Step Deductive Reasoning Over Natural Language: An Empirical Study on Out-of-Distribution Generalisation},
author={Qiming Bao and Alex Yuxuan Peng and Tim Hartill and Neset Tan and Zhenyun Deng and Michael Witbrock and Jiamou Liu},
year={2022},
publisher={The 2nd International Joint Conference on Learning and Reasoning and 16th International Workshop on Neural-Symbolic Learning and Reasoning (IJCLR-NeSy 2022)}
}
``` |
danyoung/finance-feedback | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response_a
dtype: string
- name: response_b
dtype: string
- name: label
dtype: int64
- name: bad
dtype: int64
splits:
- name: train
num_bytes: 1167091
num_examples: 457
download_size: 671478
dataset_size: 1167091
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
vinci-grape/test_case_trigger | ---
task_categories:
- text2text-generation
language:
- zh
tags:
- code
--- |
autoevaluate/autoeval-eval-project-quoref-bbfe943f-1305449897 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- quoref
eval_info:
task: extractive_question_answering
model: nbroad/rob-base-gc1
metrics: []
dataset_name: quoref
dataset_config: default
dataset_split: validation
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: nbroad/rob-base-gc1
* Dataset: quoref
* Config: default
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@nbroad](https://huggingface.co/nbroad) for evaluating this model. |
Sleoruiz/disc_cla_sexta | ---
dataset_info:
features:
- name: text
dtype: string
- name: inputs
struct:
- name: text
dtype: string
- name: prediction
list:
- name: label
dtype: string
- name: score
dtype: float64
- name: prediction_agent
dtype: string
- name: annotation
sequence: string
- name: annotation_agent
dtype: string
- name: multi_label
dtype: bool
- name: explanation
dtype: 'null'
- name: id
dtype: string
- name: metadata
dtype: 'null'
- name: status
dtype: string
- name: event_timestamp
dtype: timestamp[us]
- name: metrics
struct:
- name: text_length
dtype: int64
splits:
- name: train
num_bytes: 27186116
num_examples: 7591
download_size: 14208855
dataset_size: 27186116
---
# Dataset Card for "disc_cla_sexta"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
alisson40889/valdomiro | ---
license: openrail
---
|
AlekseyKorshuk/synthetic-role-play-lmgym | ---
dataset_info:
features:
- name: input_text
dtype: string
- name: output_text
dtype: string
splits:
- name: train
num_bytes: 52514808
num_examples: 33819
download_size: 15323715
dataset_size: 52514808
---
# Dataset Card for "synthetic-role-play-lmgym"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jan-hq/openhermes-2.5_binarized | ---
dataset_info:
features:
- name: skip_prompt_formatting
dtype: bool
- name: hash
sequence: int64
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: weight
dtype: float64
- name: title
dtype: string
- name: model
dtype: string
- name: avatarUrl
dtype: string
- name: views
dtype: int64
- name: idx
dtype: string
- name: source
dtype: string
- name: model_name
dtype: string
- name: category
dtype: string
- name: custom_instruction
dtype: bool
- name: topic
dtype: string
- name: language
dtype: string
- name: system_prompt
dtype: string
- name: id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 2947161635.4993954
num_examples: 901395
- name: test
num_bytes: 327465673.50060457
num_examples: 100156
download_size: 1692152446
dataset_size: 3274627309.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
ai-habitat/hab_fetch | ---
license: cc-by-nc-sa-4.0
pretty_name: Habitat Fetch Robot
viewer: false
---

# Fetch Robotics Mobile Manipulator
Simulation model (URDF) of Fetch Robotics' mobile manipulator research platform for use in [habitat-sim](https://github.com/facebookresearch/habitat-sim).
Adapted from https://github.com/fetchrobotics/fetch_ros
See LICENSE.txt for more details.
|
fathyshalab/massive_takeaway-de | ---
dataset_info:
features:
- name: id
dtype: string
- name: label
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 16746
num_examples: 257
- name: validation
num_bytes: 2767
num_examples: 44
- name: test
num_bytes: 3656
num_examples: 57
download_size: 16262
dataset_size: 23169
---
# Dataset Card for "massive_takeaway-de"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Gpaiva/NERDE | ---
annotations_creators:
- expert-generated
language:
- pt
language_creators:
- expert-generated
license:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: NERDE
size_categories:
- 10K<n<100K
source_datasets:
- original
tags:
- ner
- portuguese-ner
- economic-defense
task_categories:
- token-classification
task_ids:
- named-entity-recognition
---
# Dataset Card for NERDE
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Repository:** [NERDE repository](https://github.com/guipaiva/NERDE)
- **Point of Contact:** [Guilherme P. Paiva](mailto:guipaivagpp@gmail.com)
### Dataset Summary
NERDE is a dataset for Named Entity Recognition for Economic Defense. It was created in collaboration with the LATITUDE/UnB Laboratory and the Administrative Council for Economic Defense (Cade).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
The language in the dataset is Brazilian Portuguese from legal documents. The BCP-47 code for Brazilian Portuguese is pt-BR
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@guipaiva](https://github.com/guipaiva) for adding this dataset.
|
mHossain/final_train_v2_270000 | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: input_text
dtype: string
- name: target_text
dtype: string
- name: prefix
dtype: string
splits:
- name: train
num_bytes: 9136682.1
num_examples: 27000
- name: test
num_bytes: 1015186.9
num_examples: 3000
download_size: 4450927
dataset_size: 10151869.0
---
# Dataset Card for "final_train_v2_270000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DeliberatorArchiver/Reverse1999MediaData | ---
viewer: false
--- |
ashu3984/PHYSIGEN-phy-alpaca | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 731662
num_examples: 785
download_size: 263578
dataset_size: 731662
---
# Dataset Card for "PHYSIGEN-phy-alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jenyag/repo-codegen-py-all-context-path-distance | ---
dataset_info:
features:
- name: repo_id
dtype: int64
- name: repo_name
dtype: string
- name: project_context
dtype: string
- name: file_context
list:
- name: content
dtype: string
- name: type
dtype: string
- name: gt
sequence: string
- name: metainfo_separator
dtype: string
splits:
- name: test
num_bytes: 590554966
num_examples: 224
download_size: 236585246
dataset_size: 590554966
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for "repo-codegen-py-all-context-path-distance"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Multimodal-Fatima/CIFAR100_test | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': apple
'1': aquarium_fish
'2': baby
'3': bear
'4': beaver
'5': bed
'6': bee
'7': beetle
'8': bicycle
'9': bottle
'10': bowl
'11': boy
'12': bridge
'13': bus
'14': butterfly
'15': camel
'16': can
'17': castle
'18': caterpillar
'19': cattle
'20': chair
'21': chimpanzee
'22': clock
'23': cloud
'24': cockroach
'25': couch
'26': crab
'27': crocodile
'28': cup
'29': dinosaur
'30': dolphin
'31': elephant
'32': flatfish
'33': forest
'34': fox
'35': girl
'36': hamster
'37': house
'38': kangaroo
'39': keyboard
'40': lamp
'41': lawn_mower
'42': leopard
'43': lion
'44': lizard
'45': lobster
'46': man
'47': maple_tree
'48': motorcycle
'49': mountain
'50': mouse
'51': mushroom
'52': oak_tree
'53': orange
'54': orchid
'55': otter
'56': palm_tree
'57': pear
'58': pickup_truck
'59': pine_tree
'60': plain
'61': plate
'62': poppy
'63': porcupine
'64': possum
'65': rabbit
'66': raccoon
'67': ray
'68': road
'69': rocket
'70': rose
'71': sea
'72': seal
'73': shark
'74': shrew
'75': skunk
'76': skyscraper
'77': snail
'78': snake
'79': spider
'80': squirrel
'81': streetcar
'82': sunflower
'83': sweet_pepper
'84': table
'85': tank
'86': telephone
'87': television
'88': tiger
'89': tractor
'90': train
'91': trout
'92': tulip
'93': turtle
'94': wardrobe
'95': whale
'96': willow_tree
'97': wolf
'98': woman
'99': worm
- name: id
dtype: int64
- name: clip_tags_LAION_ViT_H_14_2B_simple_specific
dtype: string
- name: Attributes_ViT_L_14_descriptors_text_davinci_003_full
sequence: string
- name: Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full
sequence: string
splits:
- name: test
num_bytes: 27693774.0
num_examples: 10000
download_size: 23948177
dataset_size: 27693774.0
---
# Dataset Card for "CIFAR100_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
xrizs/prob | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': test
'1': train
'2': valid
splits:
- name: train
num_bytes: 4113145.0
num_examples: 58
- name: validation
num_bytes: 1480042.0
num_examples: 20
- name: test
num_bytes: 622722.0
num_examples: 9
download_size: 6223810
dataset_size: 6215909.0
---
# Dataset Card for "prob"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
OpenDatasets/dalle-3-dataset | ---
language:
- en
license:
- cc0-1.0
tags:
- image-text-dataset
- synthetic-dataset
dataset_info:
features:
- name: caption
dtype: string
- name: image
dtype: image
- name: link
dtype: string
- name: message_id
dtype: string
- name: timestamp
dtype: string
splits:
- name: train
num_bytes: 25851562139.271
num_examples: 14927
download_size: 25829593712
dataset_size: 25851562139.271
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for LAION DALL·E 3 Discord Dataset
**Description**: This dataset consists of caption and image pairs scraped from the LAION [share-dalle-3 discord channel](https://discord.com/channels/823813159592001537/1158354590463447092). The purpose is to collect image-text pairs for research and exploration.
**Source Code**: The code used to generate this data can be found [here](https://github.com/LAION-AI/Discord-Scrapers.git).
## Contributors
- [Zach Nagengast](https://github.com/ZachNagengast)
- [Eduardo Pach](https://github.com/EduardoPach)
- [Seva Maltsev](https://github.com/TwoAbove)
- The [LAION community](https://discord.com/invite/eq3cAMZtCC)
## Data Attributes
- **caption**: The text description or prompt associated with the image. Data type: string.
- **image**: The embedded image data from the discord message attachment. Data type: image.
- **link**: The URL to the associated image. Data type: string.
- **message_id**: The discord message id where the image was posted. Data type: string.
- **timestamp**: Time the original message was posted. Datatype: string. |
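The card does not specify the timestamp format. Assuming the strings are ISO-8601 (an assumption; inspect actual records to confirm), Python's `datetime.fromisoformat` can parse them. The value below is hypothetical, not taken from the dataset:

```python
from datetime import datetime

ts = "2023-10-05T14:30:00+00:00"   # hypothetical example value
dt = datetime.fromisoformat(ts)
print(dt.year, dt.tzinfo)
```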