| datasetId | card |
|---|---|
Multimodal-Fatima/OxfordFlowers_test_facebook_opt_2.7b_Attributes_Caption_ns_6149 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
- name: scores
sequence: float64
splits:
- name: fewshot_1_bs_16
num_bytes: 269129188.375
num_examples: 6149
- name: fewshot_0_bs_16
num_bytes: 267298050.375
num_examples: 6149
- name: fewshot_3_bs_16
num_bytes: 272760392.375
num_examples: 6149
download_size: 796875873
dataset_size: 809187631.125
---
# Dataset Card for "OxfordFlowers_test_facebook_opt_2.7b_Attributes_Caption_ns_6149"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
phucdev/noisyner | ---
annotations_creators:
- expert-generated
language:
- et
language_creators:
- found
license:
- cc-by-nc-4.0
multilinguality:
- monolingual
paperswithcode_id: noisyner
pretty_name: NoisyNER
size_categories:
- 10K<n<100K
source_datasets:
- original
tags:
- newspapers
- 1997-2009
task_categories:
- token-classification
task_ids:
- named-entity-recognition
dataset_info:
- config_name: estner_clean
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: lemmas
sequence: string
- name: grammar
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PER
'2': I-PER
'3': B-ORG
'4': I-ORG
'5': B-LOC
'6': I-LOC
splits:
- name: train
num_bytes: 7544221
num_examples: 11365
- name: validation
num_bytes: 986310
num_examples: 1480
- name: test
num_bytes: 995204
num_examples: 1433
download_size: 6258130
dataset_size: 9525735
- config_name: NoisyNER_labelset1
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: lemmas
sequence: string
- name: grammar
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PER
'2': I-PER
'3': B-ORG
'4': I-ORG
'5': B-LOC
'6': I-LOC
splits:
- name: train
num_bytes: 7544221
num_examples: 11365
- name: validation
num_bytes: 986310
num_examples: 1480
- name: test
num_bytes: 995204
num_examples: 1433
download_size: 6194276
dataset_size: 9525735
- config_name: NoisyNER_labelset2
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: lemmas
sequence: string
- name: grammar
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PER
'2': I-PER
'3': B-ORG
'4': I-ORG
'5': B-LOC
'6': I-LOC
splits:
- name: train
num_bytes: 7544221
num_examples: 11365
- name: validation
num_bytes: 986310
num_examples: 1480
- name: test
num_bytes: 995204
num_examples: 1433
download_size: 6201072
dataset_size: 9525735
- config_name: NoisyNER_labelset3
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: lemmas
sequence: string
- name: grammar
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PER
'2': I-PER
'3': B-ORG
'4': I-ORG
'5': B-LOC
'6': I-LOC
splits:
- name: train
num_bytes: 7544221
num_examples: 11365
- name: validation
num_bytes: 986310
num_examples: 1480
- name: test
num_bytes: 995204
num_examples: 1433
download_size: 6231384
dataset_size: 9525735
- config_name: NoisyNER_labelset4
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: lemmas
sequence: string
- name: grammar
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PER
'2': I-PER
'3': B-ORG
'4': I-ORG
'5': B-LOC
'6': I-LOC
splits:
- name: train
num_bytes: 7544221
num_examples: 11365
- name: validation
num_bytes: 986310
num_examples: 1480
- name: test
num_bytes: 995204
num_examples: 1433
download_size: 6201072
dataset_size: 9525735
- config_name: NoisyNER_labelset5
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: lemmas
sequence: string
- name: grammar
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PER
'2': I-PER
'3': B-ORG
'4': I-ORG
'5': B-LOC
'6': I-LOC
splits:
- name: train
num_bytes: 7544221
num_examples: 11365
- name: validation
num_bytes: 986310
num_examples: 1480
- name: test
num_bytes: 995204
num_examples: 1433
download_size: 6231384
dataset_size: 9525735
- config_name: NoisyNER_labelset6
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: lemmas
sequence: string
- name: grammar
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PER
'2': I-PER
'3': B-ORG
'4': I-ORG
'5': B-LOC
'6': I-LOC
splits:
- name: train
num_bytes: 7544221
num_examples: 11365
- name: validation
num_bytes: 986310
num_examples: 1480
- name: test
num_bytes: 995204
num_examples: 1433
download_size: 6226516
dataset_size: 9525735
- config_name: NoisyNER_labelset7
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: lemmas
sequence: string
- name: grammar
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PER
'2': I-PER
'3': B-ORG
'4': I-ORG
'5': B-LOC
'6': I-LOC
splits:
- name: train
num_bytes: 7544221
num_examples: 11365
- name: validation
num_bytes: 986310
num_examples: 1480
- name: test
num_bytes: 995204
num_examples: 1433
download_size: 6229668
dataset_size: 9525735
---
# Dataset Card for NoisyNER
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Repository:** [Estonian NER corpus](https://doi.org/10.15155/1-00-0000-0000-0000-00073L), [NoisyNER dataset](https://github.com/uds-lsv/NoisyNER)
- **Paper:** [Named Entity Recognition in Estonian](https://aclanthology.org/W13-2412/), [Analysing the Noise Model Error for Realistic Noisy Label Data](https://arxiv.org/abs/2101.09763)
- **Dataset:** NoisyNER
- **Domain:** News
- **Size of downloaded dataset files:** 6.23 MB
- **Size of the generated dataset files:** 9.53 MB
### Dataset Summary
NoisyNER is a dataset for evaluating methods that handle noisy labels when training machine learning models. It comes from the NLP/information extraction domain and was created through a realistic distant supervision technique.
- Entity types: `PER`, `ORG`, `LOC`
Some highlights and interesting aspects of the data are:
- Seven sets of labels with differing noise patterns to evaluate different noise levels on the same instances
- Full parallel clean labels available to compute upper performance bounds or study scenarios where a small amount of gold-standard data can be leveraged
- Skewed label distribution (typical for Named Entity Recognition tasks)
- For some label sets: noise level higher than the true label probability
- Sequential dependencies between the labels
For more details on the dataset and its creation process, please refer to the original authors' publication https://ojs.aaai.org/index.php/AAAI/article/view/16938 (published at AAAI'21).
This dataset is based on the Estonian NER corpus. For more details, see https://aclanthology.org/W13-2412/
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
The language data in NoisyNER is in Estonian (BCP-47: `et`).
## Dataset Structure
### Data Instances
An example from the 'train' split looks as follows.
```
{
'id': '0',
'tokens': ['Tallinna', 'õhusaaste', 'suureneb', '.'],
'lemmas': ['Tallinn+0', 'õhu_saaste+0', 'suurene+b', '.'],
'grammar': ['_H_ sg g', '_S_ sg n', '_V_ b', '_Z_'],
'ner_tags': [5, 0, 0, 0]
}
```
### Data Fields
The data fields are the same among all splits.
- `id`: a `string` feature.
- `tokens`: a `list` of `string` features.
- `lemmas`: a `list` of `string` features.
- `grammar`: a `list` of `string` features.
- `ner_tags`: a `list` of classification labels (`int`). Full tagset with indices:
```python
{'O': 0, 'B-PER': 1, 'I-PER': 2, 'B-ORG': 3, 'I-ORG': 4, 'B-LOC': 5, 'I-LOC': 6}
```
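For illustration, the integer `ner_tags` in the example instance above can be decoded back into label names using this tagset (plain Python, no dataset download required):

```python
# Index-to-name mapping, matching the tagset above.
tag_names = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-LOC", "I-LOC"]

example = {
    "tokens": ["Tallinna", "õhusaaste", "suureneb", "."],
    "ner_tags": [5, 0, 0, 0],
}

# Pair each token with its decoded label.
decoded = [(token, tag_names[tag]) for token, tag in zip(example["tokens"], example["ner_tags"])]
print(decoded)  # [('Tallinna', 'B-LOC'), ('õhusaaste', 'O'), ('suureneb', 'O'), ('.', 'O')]
```

When the dataset is loaded with the `datasets` library, the same mapping is also available programmatically via `dataset.features["ner_tags"].feature.names`.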
### Data Splits
The splits are the same across all configurations.
|train|validation|test|
|----:|---------:|---:|
|11365| 1480|1433|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
Tkachenko et al. (2013) collected 572 news stories published in the local online newspapers [Delfi](http://delfi.ee/) and [Postimees](http://postimees.ee/) between 1997 and 2009. The selected articles cover both local and international news on a range of topics, including politics, economics, and sports. The raw text was preprocessed using the morphological disambiguator t3mesta ([Kaalep and Vaino, 1998](https://www.cl.ut.ee/yllitised/kk_yhest_1998.pdf)) provided by [Filosoft](http://www.filosoft.ee/). The processing steps involve tokenization, lemmatization, part-of-speech tagging, and grammatical and morphological analysis.
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
According to Tkachenko et al. (2013), one of the authors manually tagged the corpus and the other author examined the tags, after which conflicting cases were resolved.
The total size of the corpus is 184,638 tokens. Tkachenko et al. (2013) report the following numbers of named entities in the corpus:
| | PER | LOC | ORG | Total |
|--------|------|------|------|-------|
| All | 5762 | 5711 | 3938 | 15411 |
| Unique | 3588 | 1589 | 1987 | 7164 |
Hedderich et al. (2021) obtained the noisy labels through a distant supervision/automatic annotation approach. They extracted lists of named entities from Wikidata and matched them against words in the text via the ANEA tool ([Hedderich, Lange, and Klakow 2021](https://arxiv.org/abs/2102.13129)). They also used heuristic functions to correct errors caused by incomplete entity lists, grammatical complexities of Estonian that prevent simple string matching, and entity lists that conflict with each other. For instance, they normalized the grammatical form of a word or excluded certain high false-positive words. They provide seven sets of labels that differ in the noise process; together with the original clean labels, this results in eight different configurations.
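The core matching idea behind this kind of distant supervision can be sketched in a few lines. This is a simplified illustration with a hypothetical, deliberately incomplete entity list; the actual pipeline used Wikidata-derived lists, the ANEA tool, and the correction heuristics described above:

```python
def distant_tag(tokens, entity_lists):
    """Tag any token found in an entity list as B-<TYPE>; everything else gets O."""
    tags = ["O"] * len(tokens)
    for i, token in enumerate(tokens):
        for entity_type, names in entity_lists.items():
            if token in names:
                tags[i] = f"B-{entity_type}"
    return tags

# Hypothetical entity lists; gaps and ambiguities in such lists are the source of label noise.
entity_lists = {
    "LOC": {"Tallinna", "Tartu"},
    "PER": {"Sven"},
}

print(distant_tag(["Tallinna", "õhusaaste", "suureneb", "."], entity_lists))
# ['B-LOC', 'O', 'O', 'O']
```

Any entity missing from the lists silently receives the `O` tag, which is exactly how the differing noise patterns across the seven label sets arise.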
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@inproceedings{tkachenko-etal-2013-named,
title = "Named Entity Recognition in {E}stonian",
author = "Tkachenko, Alexander and
Petmanson, Timo and
Laur, Sven",
booktitle = "Proceedings of the 4th Biennial International Workshop on {B}alto-{S}lavic Natural Language Processing",
month = aug,
year = "2013",
address = "Sofia, Bulgaria",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/W13-2412",
pages = "78--83",
}
@article{Hedderich_Zhu_Klakow_2021,
title={Analysing the Noise Model Error for Realistic Noisy Label Data},
author={Hedderich, Michael A. and Zhu, Dawei and Klakow, Dietrich},
volume={35},
url={https://ojs.aaai.org/index.php/AAAI/article/view/16938},
number={9},
journal={Proceedings of the AAAI Conference on Artificial Intelligence},
year={2021},
month={May},
pages={7675-7684},
}
```
### Contributions
Thanks to [@phucdev](https://github.com/phucdev) for adding this dataset. |
timm/oxford-iiit-pet | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': abyssinian
'1': american_bulldog
'2': american_pit_bull_terrier
'3': basset_hound
'4': beagle
'5': bengal
'6': birman
'7': bombay
'8': boxer
'9': british_shorthair
'10': chihuahua
'11': egyptian_mau
'12': english_cocker_spaniel
'13': english_setter
'14': german_shorthaired
'15': great_pyrenees
'16': havanese
'17': japanese_chin
'18': keeshond
'19': leonberger
'20': maine_coon
'21': miniature_pinscher
'22': newfoundland
'23': persian
'24': pomeranian
'25': pug
'26': ragdoll
'27': russian_blue
'28': saint_bernard
'29': samoyed
'30': scottish_terrier
'31': shiba_inu
'32': siamese
'33': sphynx
'34': staffordshire_bull_terrier
'35': wheaten_terrier
'36': yorkshire_terrier
- name: image_id
dtype: string
- name: label_cat_dog
dtype:
class_label:
names:
'0': cat
'1': dog
splits:
- name: train
num_bytes: 376746044.08
num_examples: 3680
- name: test
num_bytes: 426902517.206
num_examples: 3669
download_size: 790265316
dataset_size: 803648561.286
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
license: cc-by-sa-4.0
size_categories:
- 1K<n<10K
task_categories:
- image-classification
---
# The Oxford-IIIT Pet Dataset
## Description
A 37-category pet dataset with roughly 200 images per class. The images have large variations in scale, pose, and lighting.
This instance of the dataset uses the standard label ordering and includes the standard train/test splits. Trimaps and bounding boxes are not included, but the `image_id` field can be used to look up those annotations in the official metadata.
Website: https://www.robots.ox.ac.uk/~vgg/data/pets/
## Citation
```bibtex
@InProceedings{parkhi12a,
author = "Omkar M. Parkhi and Andrea Vedaldi and Andrew Zisserman and C. V. Jawahar",
title = "Cats and Dogs",
booktitle = "IEEE Conference on Computer Vision and Pattern Recognition",
year = "2012",
}
``` |
gachan/Lulav1 | ---
license: openrail
---
|
nid989/EssayFroum-Dataset | ---
license: apache-2.0
---
|
zolak/twitter_dataset_1713001970 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2496607
num_examples: 6154
download_size: 1248218
dataset_size: 2496607
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
TinyPixel/s_3 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 39584126
num_examples: 69374
download_size: 19266010
dataset_size: 39584126
---
# Dataset Card for "s_3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MakeThat2/spamchek | ---
license: mit
---
|
Veucci/lyric-to-3genre | ---
license: cc-by-nc-4.0
size_categories:
- 1K<n<10K
task_categories:
- text-classification
language:
- en
tags:
- music
---
# Song Lyrics Dataset
## Description
This dataset contains a collection of song lyrics from various artists and genres in English. It is intended to be used for research, analysis, and other non-commercial purposes.
## Dataset Details
The dataset is organized in a tabular format with the following columns:
- `Genre` (int): the genre of the song.
- `Lyrics` (str): the lyrics of the song.
The genre distribution is:
- Pop: 979 rows
- Rock: 995 rows
- Hip-Hop: 1040 rows
## Usage
Feel free to use this dataset for non-commercial purposes such as academic research, natural language processing tasks, sentiment analysis, or personal projects. You are allowed to analyze, modify, and derive insights from the dataset.
If you use this dataset in your work, we kindly request that you provide attribution by citing this repository or linking back to it.
## License
This dataset is released under the Creative Commons Attribution-NonCommercial license. This means that you are not allowed to use the dataset for commercial purposes. For detailed information about the license, please refer to the [LICENSE](./LICENSE) file.
## Contact
If you have any questions, suggestions, or concerns regarding this dataset, please feel free to reach out via email at [efe.ozkan732@gmail.com](mailto:efe.ozkan732@gmail.com).
Happy exploring and analyzing the world of song lyrics!
|
kogi-jwu/bt_jhumaneval | ---
license: mit
configs:
- config_name: default
data_files:
- split: test
path: "bt_jhumaneval.jsonl"
---
# Dataset Card for bt_jhumaneval
## How to use
```python
from datasets import load_dataset
dataset = load_dataset("kogi-jwu/bt_jhumaneval")
```
|
ROIM/temporal-alignment-qa | ---
license: apache-2.0
---
|
nyanko7/pixiv_top50 | ---
license: openrail
---
|
liuyanchen1015/MULTI_VALUE_stsb_here_come | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 2770
num_examples: 13
- name: test
num_bytes: 506
num_examples: 4
- name: train
num_bytes: 2332
num_examples: 12
download_size: 12740
dataset_size: 5608
---
# Dataset Card for "MULTI_VALUE_stsb_here_come"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_KnutJaegersberg__Qwen-14B-Llamafied | ---
pretty_name: Evaluation run of KnutJaegersberg/Qwen-14B-Llamafied
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KnutJaegersberg/Qwen-14B-Llamafied](https://huggingface.co/KnutJaegersberg/Qwen-14B-Llamafied)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__Qwen-14B-Llamafied\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-13T19:31:00.889052](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Qwen-14B-Llamafied/blob/main/results_2024-01-13T19-31-00.889052.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6576627342855309,\n\
\ \"acc_stderr\": 0.032068617008274285,\n \"acc_norm\": 0.661971967843754,\n\
\ \"acc_norm_stderr\": 0.03270262072736983,\n \"mc1\": 0.30966952264381886,\n\
\ \"mc1_stderr\": 0.016185744355144912,\n \"mc2\": 0.4560156985268844,\n\
\ \"mc2_stderr\": 0.014814301713999594\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.507679180887372,\n \"acc_stderr\": 0.01460966744089257,\n\
\ \"acc_norm\": 0.5520477815699659,\n \"acc_norm_stderr\": 0.014532011498211676\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6353316072495518,\n\
\ \"acc_stderr\": 0.004803533333364225,\n \"acc_norm\": 0.8231428002389962,\n\
\ \"acc_norm_stderr\": 0.003807680331172903\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.042446332383532286,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.042446332383532286\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n\
\ \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106134,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.58,\n \"acc_stderr\": 0.04960449637488584,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.04960449637488584\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.049598599663841815,\n\
\ \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.049598599663841815\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6382978723404256,\n \"acc_stderr\": 0.03141082197596241,\n\
\ \"acc_norm\": 0.6382978723404256,\n \"acc_norm_stderr\": 0.03141082197596241\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.0470070803355104,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.0470070803355104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6758620689655173,\n \"acc_stderr\": 0.03900432069185554,\n\
\ \"acc_norm\": 0.6758620689655173,\n \"acc_norm_stderr\": 0.03900432069185554\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5502645502645502,\n \"acc_stderr\": 0.02562085704293665,\n \"\
acc_norm\": 0.5502645502645502,\n \"acc_norm_stderr\": 0.02562085704293665\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8193548387096774,\n\
\ \"acc_stderr\": 0.02188617856717253,\n \"acc_norm\": 0.8193548387096774,\n\
\ \"acc_norm_stderr\": 0.02188617856717253\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.6354679802955665,\n \"acc_stderr\": 0.033864057460620905,\n\
\ \"acc_norm\": 0.6354679802955665,\n \"acc_norm_stderr\": 0.033864057460620905\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\"\
: 0.76,\n \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026552207828215286,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026552207828215286\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.927461139896373,\n \"acc_stderr\": 0.018718998520678192,\n\
\ \"acc_norm\": 0.927461139896373,\n \"acc_norm_stderr\": 0.018718998520678192\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6897435897435897,\n \"acc_stderr\": 0.02345467488940429,\n \
\ \"acc_norm\": 0.6897435897435897,\n \"acc_norm_stderr\": 0.02345467488940429\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37777777777777777,\n \"acc_stderr\": 0.02956070739246571,\n \
\ \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.02956070739246571\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7184873949579832,\n \"acc_stderr\": 0.029213549414372174,\n\
\ \"acc_norm\": 0.7184873949579832,\n \"acc_norm_stderr\": 0.029213549414372174\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4105960264900662,\n \"acc_stderr\": 0.04016689594849927,\n \"\
acc_norm\": 0.4105960264900662,\n \"acc_norm_stderr\": 0.04016689594849927\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8311926605504587,\n \"acc_stderr\": 0.016060056268530368,\n \"\
acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.016060056268530368\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5787037037037037,\n \"acc_stderr\": 0.03367462138896078,\n \"\
acc_norm\": 0.5787037037037037,\n \"acc_norm_stderr\": 0.03367462138896078\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6225490196078431,\n \"acc_stderr\": 0.03402272044340703,\n \"\
acc_norm\": 0.6225490196078431,\n \"acc_norm_stderr\": 0.03402272044340703\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8396624472573839,\n \"acc_stderr\": 0.02388438092596567,\n \
\ \"acc_norm\": 0.8396624472573839,\n \"acc_norm_stderr\": 0.02388438092596567\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7354260089686099,\n\
\ \"acc_stderr\": 0.029605103217038332,\n \"acc_norm\": 0.7354260089686099,\n\
\ \"acc_norm_stderr\": 0.029605103217038332\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8264462809917356,\n \"acc_stderr\": 0.03457272836917671,\n \"\
acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.03457272836917671\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5892857142857143,\n\
\ \"acc_stderr\": 0.04669510663875192,\n \"acc_norm\": 0.5892857142857143,\n\
\ \"acc_norm_stderr\": 0.04669510663875192\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n\
\ \"acc_stderr\": 0.013428186370608303,\n \"acc_norm\": 0.8301404853128991,\n\
\ \"acc_norm_stderr\": 0.013428186370608303\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258172,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258172\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4770949720670391,\n\
\ \"acc_stderr\": 0.016704945740326188,\n \"acc_norm\": 0.4770949720670391,\n\
\ \"acc_norm_stderr\": 0.016704945740326188\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.024954184324879915,\n\
\ \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.024954184324879915\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7363344051446945,\n\
\ \"acc_stderr\": 0.02502553850053234,\n \"acc_norm\": 0.7363344051446945,\n\
\ \"acc_norm_stderr\": 0.02502553850053234\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.024922001168886335,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.024922001168886335\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4810951760104302,\n\
\ \"acc_stderr\": 0.012761104871472662,\n \"acc_norm\": 0.4810951760104302,\n\
\ \"acc_norm_stderr\": 0.012761104871472662\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n\
\ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7026143790849673,\n \"acc_stderr\": 0.018492596536396955,\n \
\ \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.018492596536396955\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399683,\n\
\ \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399683\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30966952264381886,\n\
\ \"mc1_stderr\": 0.016185744355144912,\n \"mc2\": 0.4560156985268844,\n\
\ \"mc2_stderr\": 0.014814301713999594\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7655880031570639,\n \"acc_stderr\": 0.011906130106237986\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5276724791508719,\n \
\ \"acc_stderr\": 0.013751375538801326\n }\n}\n```"
repo_url: https://huggingface.co/KnutJaegersberg/Qwen-14B-Llamafied
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|arc:challenge|25_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|gsm8k|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hellaswag|10_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T19-31-00.889052.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T19-31-00.889052.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- '**/details_harness|winogrande|5_2024-01-13T19-31-00.889052.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-13T19-31-00.889052.parquet'
- config_name: results
data_files:
- split: 2024_01_13T19_31_00.889052
path:
- results_2024-01-13T19-31-00.889052.parquet
- split: latest
path:
- results_2024-01-13T19-31-00.889052.parquet
---
# Dataset Card for Evaluation run of KnutJaegersberg/Qwen-14B-Llamafied
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [KnutJaegersberg/Qwen-14B-Llamafied](https://huggingface.co/KnutJaegersberg/Qwen-14B-Llamafied) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__Qwen-14B-Llamafied",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-13T19:31:00.889052](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Qwen-14B-Llamafied/blob/main/results_2024-01-13T19-31-00.889052.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6576627342855309,
"acc_stderr": 0.032068617008274285,
"acc_norm": 0.661971967843754,
"acc_norm_stderr": 0.03270262072736983,
"mc1": 0.30966952264381886,
"mc1_stderr": 0.016185744355144912,
"mc2": 0.4560156985268844,
"mc2_stderr": 0.014814301713999594
},
"harness|arc:challenge|25": {
"acc": 0.507679180887372,
"acc_stderr": 0.01460966744089257,
"acc_norm": 0.5520477815699659,
"acc_norm_stderr": 0.014532011498211676
},
"harness|hellaswag|10": {
"acc": 0.6353316072495518,
"acc_stderr": 0.004803533333364225,
"acc_norm": 0.8231428002389962,
"acc_norm_stderr": 0.003807680331172903
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.042446332383532286,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.042446332383532286
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106134,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.049598599663841815,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.049598599663841815
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6382978723404256,
"acc_stderr": 0.03141082197596241,
"acc_norm": 0.6382978723404256,
"acc_norm_stderr": 0.03141082197596241
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.0470070803355104,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.0470070803355104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6758620689655173,
"acc_stderr": 0.03900432069185554,
"acc_norm": 0.6758620689655173,
"acc_norm_stderr": 0.03900432069185554
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5502645502645502,
"acc_stderr": 0.02562085704293665,
"acc_norm": 0.5502645502645502,
"acc_norm_stderr": 0.02562085704293665
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8193548387096774,
"acc_stderr": 0.02188617856717253,
"acc_norm": 0.8193548387096774,
"acc_norm_stderr": 0.02188617856717253
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6354679802955665,
"acc_stderr": 0.033864057460620905,
"acc_norm": 0.6354679802955665,
"acc_norm_stderr": 0.033864057460620905
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026552207828215286,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026552207828215286
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.927461139896373,
"acc_stderr": 0.018718998520678192,
"acc_norm": 0.927461139896373,
"acc_norm_stderr": 0.018718998520678192
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6897435897435897,
"acc_stderr": 0.02345467488940429,
"acc_norm": 0.6897435897435897,
"acc_norm_stderr": 0.02345467488940429
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.02956070739246571,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.02956070739246571
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7184873949579832,
"acc_stderr": 0.029213549414372174,
"acc_norm": 0.7184873949579832,
"acc_norm_stderr": 0.029213549414372174
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4105960264900662,
"acc_stderr": 0.04016689594849927,
"acc_norm": 0.4105960264900662,
"acc_norm_stderr": 0.04016689594849927
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.016060056268530368,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.016060056268530368
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5787037037037037,
"acc_stderr": 0.03367462138896078,
"acc_norm": 0.5787037037037037,
"acc_norm_stderr": 0.03367462138896078
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6225490196078431,
"acc_stderr": 0.03402272044340703,
"acc_norm": 0.6225490196078431,
"acc_norm_stderr": 0.03402272044340703
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8396624472573839,
"acc_stderr": 0.02388438092596567,
"acc_norm": 0.8396624472573839,
"acc_norm_stderr": 0.02388438092596567
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7354260089686099,
"acc_stderr": 0.029605103217038332,
"acc_norm": 0.7354260089686099,
"acc_norm_stderr": 0.029605103217038332
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.03457272836917671,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.03457272836917671
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5892857142857143,
"acc_stderr": 0.04669510663875192,
"acc_norm": 0.5892857142857143,
"acc_norm_stderr": 0.04669510663875192
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608303,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608303
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258172,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258172
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4770949720670391,
"acc_stderr": 0.016704945740326188,
"acc_norm": 0.4770949720670391,
"acc_norm_stderr": 0.016704945740326188
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.024954184324879915,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.024954184324879915
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7363344051446945,
"acc_stderr": 0.02502553850053234,
"acc_norm": 0.7363344051446945,
"acc_norm_stderr": 0.02502553850053234
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.024922001168886335,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.024922001168886335
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4810951760104302,
"acc_stderr": 0.012761104871472662,
"acc_norm": 0.4810951760104302,
"acc_norm_stderr": 0.012761104871472662
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.02806499816704009,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.02806499816704009
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.018492596536396955,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.018492596536396955
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399683,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399683
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30966952264381886,
"mc1_stderr": 0.016185744355144912,
"mc2": 0.4560156985268844,
"mc2_stderr": 0.014814301713999594
},
"harness|winogrande|5": {
"acc": 0.7655880031570639,
"acc_stderr": 0.011906130106237986
},
"harness|gsm8k|5": {
"acc": 0.5276724791508719,
"acc_stderr": 0.013751375538801326
}
}
```
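Individual metrics can be pulled straight out of this results dictionary. As a minimal sketch (with the dictionary truncated to three tasks for brevity; the full run contains all 57 `hendrycksTest` subjects), the MMLU-style aggregate is just the mean of the per-task `acc` values:

```python
# Minimal sketch: aggregate per-task accuracies from a results dictionary
# shaped like the one above (truncated here to three tasks for brevity).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.38},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5925925925925926},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6907894736842105},
}

# Keep only the MMLU (hendrycksTest) tasks and average their accuracies.
mmlu_accs = [
    metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(round(mmlu_avg, 4))  # → 0.5545
```

The same pattern works on the full JSON file linked above once it is downloaded.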
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
clonandovoz/billiejean | ---
license: openrail
---
|
TrainingDataPro/brain-anomaly-detection | ---
license: cc-by-nc-nd-4.0
task_categories:
- image-classification
- image-to-image
- image-segmentation
- object-detection
language:
- en
tags:
- medical
- code
- biology
---
# Brain MRI Dataset, Arnold-Chiari Malformation Detection & Segmentation
The dataset consists of .dcm files containing **MRI scans of the brain** of a person with an Arnold-Chiari malformation. The images are **labeled** by doctors and accompanied by a **report** in PDF format.
The dataset includes 6 studies, made from different angles, which together provide a comprehensive picture of the Arnold-Chiari anomaly and of signs of dysplasia of the cranio-vertebral junction (platybasia).
### MRI study angles in the dataset

# 💴 For Commercial Usage: Full version of the dataset includes 100,000 brain studies of people with different conditions, leave a request on **[TrainingData](https://trainingdata.pro/data-market/brain-mri?utm_source=huggingface&utm_medium=cpc&utm_campaign=brain-anomaly-detection)** to buy the dataset
### Types of diseases and conditions in the full dataset:
- Cancer
- Multiple sclerosis
- Metastatic lesion
- Arnold-Chiari malformation
- Focal gliosis of the brain
- **AND MANY OTHER CONDITIONS**
.gif?generation=1707995771155527&alt=media)
The dataset holds great value for researchers and medical professionals involved in oncology, radiology, and medical imaging. It can be used for a wide range of purposes, including developing and evaluating novel imaging techniques, training and validating machine learning algorithms for automated tumor detection and segmentation, analyzing tumor response to different treatments, and studying the relationship between imaging features and clinical outcomes.
# 💴 Buy the Dataset: This is just an example of the data. Leave a request on [https://trainingdata.pro/data-market](https://trainingdata.pro/data-market/brain-mri?utm_source=huggingface&utm_medium=cpc&utm_campaign=brain-anomaly-detection) to discuss your requirements, learn about the price and buy the dataset
# Content
### The dataset includes:
- **ST000001**: includes subfolders with 6 studies. Each study includes MRI-scans in **.dcm and .jpg formats**,
- **DICOMDIR**: includes information about the patient's condition and links to access files,
- **Brain_MRI_3.pdf**: includes medical report, provided by the radiologist,
- **.csv file**: includes id of the studies and the number of files
### Medical reports include the following data:
- Patient's **demographic information**,
- **Description** of the case,
- Preliminary **diagnosis**,
- **Recommendations** on the further actions
*All patients consented to the publication of data*
# Medical data can be collected in accordance with your requirements.
## [TrainingData](https://trainingdata.pro/data-market/brain-mri?utm_source=huggingface&utm_medium=cpc&utm_campaign=brain-anomaly-detection) provides high-quality data annotation tailored to your needs
More datasets in TrainingData's Kaggle account: **<https://www.kaggle.com/trainingdatapro/datasets>**
TrainingData's GitHub: **https://github.com/Trainingdata-datamarket/TrainingData_All_datasets**
*keywords: mri brain scan, brain tumor, brain cancer, oncology, neuroimaging, radiology, brain metastasis, glioblastoma, meningioma, pituitary tumor, medulloblastoma, astrocytoma, oligodendroglioma, ependymoma, neuro-oncology, brain lesion, brain metastasis detection, brain tumor classification, brain tumor segmentation, brain tumor diagnosis, brain tumor prognosis, brain tumor treatment, brain tumor surgery, brain tumor radiation therapy, brain tumor chemotherapy, brain tumor clinical trials, brain tumor research, brain tumor awareness, brain tumor support, brain tumor survivor, neurosurgery, neurologist, neuroradiology, neuro-oncologist, neuroscientist, medical imaging, cancer detection, cancer segmentation, tumor, computed tomography, head, skull, brain scan, eye sockets, sinuses, computer vision, deep learning* |
rohansolo/ultrachat-200k-sgpt | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train_sft
num_bytes: 1072397668
num_examples: 184780
- name: test_sft
num_bytes: 267066603
num_examples: 46195
download_size: 676751661
dataset_size: 1339464271
configs:
- config_name: default
data_files:
- split: train_sft
path: data/train_sft-*
- split: test_sft
path: data/test_sft-*
---
|
autoevaluate/autoeval-staging-eval-project-banking77-9cb960fa-11435523 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- banking77
eval_info:
task: multi_class_classification
model: nickprock/distilbert-base-uncased-banking77-classification
metrics: []
dataset_name: banking77
dataset_config: default
dataset_split: test
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: nickprock/distilbert-base-uncased-banking77-classification
* Dataset: banking77
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@nickprock](https://huggingface.co/nickprock) for evaluating this model. |
andersonbcdefg/MEDI-processed-no-instruct-dedup-taskfiltered | ---
dataset_info:
features:
- name: pos
dtype: string
- name: task_name
dtype: string
- name: neg
dtype: string
- name: query
dtype: string
splits:
- name: train
num_bytes: 425167107.0593314
num_examples: 337877
download_size: 321552494
dataset_size: 425167107.0593314
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
humantrue/title_translations | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 61587561
num_examples: 9980
download_size: 27343071
dataset_size: 61587561
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
MartinDx/v6 | ---
license: mit
---
|
demelin/moral_stories | ---
annotations_creators:
- no-annotation
language:
- en
language_creators:
- crowdsourced
license:
- mit
multilinguality:
- monolingual
pretty_name: Moral Stories
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- multiple-choice
- text-generation
- text-classification
- commonsense-reasoning
- moral-reasoning
- social-reasoning
task_ids:
- multiple-choice-qa
- language-modeling
- text-scoring
---
# Dataset Card for Moral Stories
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-instances)
- [Data Splits](#data-instances)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** [Moral Stories repository](https://github.com/demelin/moral_stories)
- **Repository:** [Moral Stories repository](https://github.com/demelin/moral_stories)
- **Paper:** [Moral Stories: Situated Reasoning about Norms, Intents, Actions, and their Consequences](https://aclanthology.org/2021.emnlp-main.54/)
- **Leaderboard:** [N/A]
- **Point of Contact:** [Denis Emelin](demelin.github.io)
### Dataset Summary
Moral Stories is a crowd-sourced dataset of structured narratives that describe normative and norm-divergent actions taken by individuals to accomplish certain intentions in concrete situations, and their respective consequences. All stories in the dataset consist of seven sentences, belonging to the following categories:
- Norm: A guideline for social conduct generally observed by most people in everyday situations.
- Situation: Setting of the story that introduces story participants and describes their environment.
- Intention: Reasonable goal that one of the story participants (the actor), wants to fulfill.
- Normative action: An action by the actor that fulfills the intention and observes the norm.
- Normative consequence: Possible effect of the normative action on the actor's environment.
- Divergent action: An action by the actor that fulfills the intention and diverges from the norm.
- Divergent consequence: Possible effect of the divergent action on the actor's environment.
Accordingly, each story's constituent sentences can be grouped into three segments. The context segment grounds actions within a particular social scenario, the normative path contains the normative action and its consequence, whereas the divergent path includes their norm-divergent analogues. Combining the context segment separately with each path yields two self-contained sub-stories differing in the adherence of the described events to social expectations. See also [*Section 2* in the dataset paper](https://aclanthology.org/2021.emnlp-main.54.pdf).
### Supported Tasks and Leaderboards
- commonsense-reasoning / social-reasoning / moral-reasoning: The dataset can also be used to evaluate whether pretrained language models can reason about actions that are either consistent or inconsistent with social norms, the consequences of actions, and the norms that may motivate specific action choices. A [BART model](https://huggingface.co/facebook/bart-large) can be used for this purpose.
- text-classification: This dataset can be used to train models to differentiate between normative and divergent actions as well as between plausible and implausible consequences of actions. A [RoBERTa-based model](https://huggingface.co/roberta-base) can be used for this purpose.
- text-generation: The dataset can be used to train models to generate normative / divergent action hypotheses and their consequences, or norms that may explain certain action choices, conditioned on contexts of different scope. A [GPT-based model](https://huggingface.co/EleutherAI/gpt-neo-2.7B) can be used for this purpose.
### Languages
*Moral Stories* is available in English, with mainstream US Englishes being the dominant variety, as indicated by self-reported contributor demographics provided in the [*Ethical Considerations* section of the dataset paper](https://aclanthology.org/2021.emnlp-main.54.pdf).
## Dataset Structure
### Data Instances
Due to its highly structured nature, *Moral Stories* enables a variety of tasks, such as action classification or norm generation. Furthermore, we provide different data splits in an attempt to challenge generalization abilities of the evaluated models. For details, refer to [*Section 2* in the dataset paper](https://aclanthology.org/2021.emnlp-main.54.pdf). Here, we replicate one instance from the full, task-agnostic dataset, one instance from a classification task dataset (cls-action+norm), and one instance from a generation task dataset (gen-consequence$action+context):
"full" instance:
{"ID": "3OJSZ2ATDTUMG7NGR28931AHDQF57F",
"norm": "It's kind to wait for people.",
"situation": "Josh drives Penelope to work every day and is hanging outside her house to pick her up, but there's no sign of her.",
"intention": "Josh needs to get to work.",
"moral_action": "Josh continues to sit outside Penelope's house until she walks out the door a few minutes later.",
"moral_consequence": "Penelope appreciates Josh's patience and thanks him for waiting around.",
"immoral_action": "Josh shrugs his shoulders and drives away to work after sitting around a moment.",
"immoral_consequence": "Penelope comes running out of the house shouting to wait, but she's too late."}
positive and negative "cls-action+norm" instances (i.e. classify an action as normative or divergent given the action and a relevant norm):
{"ID": "3EF8EXOTT2TGCKNZROVU3129RYNJ1G1",
"norm": "It's bad to be too loud and wake people up.",
"moral_action": "Jordan quietly makes himself a bowl of cereal for breakfast.",
"label": "1"}
{"ID": "3EF8EXOTT2TGCKNZROVU3129RYNJ1G0",
"norm": "It's bad to be too loud and wake people up.",
"immoral_action": "Jordan noisily bangs a lot of pots and pans when making his breakfast.",
"label": "0"}
"gen-consequence$action+context" instance (i.e. generate either a plausible or implausible consequence given an action and its context):
{"ID": "33FOTY3KENJT46HTHWE8GXPE3LE1C03",
"norm": "It's mean to hit a bear with a guitar.",
"situation": "Joe was camping in Montana with his band when he came across a black bear.",
"intention": "Joe wanted to scare away the black bear.",
"immoral_action": "Joe grabbed his guitarist's guitar and hit the bear with it.",
"immoral_consequence": "The bear suffers a concussion and doesn't understand why he was hit.",
"label": "1"}
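The mapping from a "full" story to task-specific instances is mechanical: each classification pair re-uses fields from the context segment and the two paths, with the label convention shown above ("1" for the normative action, "0" for the divergent one). A minimal sketch (the helper function is illustrative, not the dataset's own preprocessing code):

```python
def to_cls_action_norm(story):
    """Split a full Moral Stories record into the positive / negative
    instances of the cls-action+norm task (sketch based on the examples above)."""
    positive = {
        "ID": story["ID"],
        "norm": story["norm"],
        "moral_action": story["moral_action"],
        "label": "1",  # normative action
    }
    negative = {
        "ID": story["ID"],
        "norm": story["norm"],
        "immoral_action": story["immoral_action"],
        "label": "0",  # divergent action
    }
    return positive, negative

story = {
    "ID": "3OJSZ2ATDTUMG7NGR28931AHDQF57F",
    "norm": "It's kind to wait for people.",
    "moral_action": "Josh continues to sit outside Penelope's house until she "
                    "walks out the door a few minutes later.",
    "immoral_action": "Josh shrugs his shoulders and drives away to work after "
                      "sitting around a moment.",
}
pos, neg = to_cls_action_norm(story)
```

Tasks conditioned on larger contexts (e.g. action+norm+context) extend the same records with the situation and intention fields.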
### Data Fields
- "ID": Unique identifier ID for this dataset instance.
- "norm": A guideline for social conduct generally observed by most people in everyday situations.
- "situation": Setting of the story that introduces story participants and describes their environment.
- "intention": Reasonable goal that one of the story participants (the actor), wants to fulfill.
- "moral_(i.e. 'normative')_action": An action by the actor that fulfills the intention and observes the norm.
- "moral_consequence": Possible effect of the normative action on the actor's environment.
- "immoral_(i.e. 'divergent')_action": An action by the actor that fulfills the intention and diverges from the norm.
- "immoral_consequence": Possible effect of the divergent action on the actor's environment.
- "label": Data instance label; for action-related tasks, "0" corresponds to an immoral / divergent action while "1" corresponds to a moral / normative action, for consequence-related tasks, "0" corresponds to a plausible consequence while "1" corresponds to an implausible consequence (for generation tasks, label is always set to "1")
### Data Splits
For classification tasks, we examined three data split strategies:
- *Norm Distance*: Norms are based on social consensus and may, as such, change across time and between locations. Therefore, we are also interested in how well classification models can generalize to novel norms. To estimate this, we split the dataset by embedding
norms found in the collected stories and grouping them into 1k clusters via agglomerative clustering. Clusters are ordered according to their degree of isolation, defined as the cosine distance between a cluster's centroid and the next-closest cluster's centroid. Stories with norms from most isolated clusters are assigned to test and development sets, with the rest forming the training set.
- *Lexical Bias*: Tests the susceptibility of classifiers to surface-level lexical correlations. We first identify 100 biased lemmas that occur most frequently either in normative or divergent actions. Each story is then assigned a bias score corresponding to the total number of biased lemmas present in both actions (or consequences). Starting with the lowest bias scores, stories are assigned to the test, development, and, lastly, training set.
- *Minimal Pairs*: Evaluates the model's ability to perform nuanced social reasoning. Splits are obtained by ordering stories according to the Damerau-Levenshtein distance between their actions (or consequences) and assigning stories with lowest distances to the test set, followed by the development set. The remainder makes up the training set.
For generation tasks, only the *Norm Distance* split strategy is used. For more details, refer to [*Section 3* and *Appendix C* in the dataset paper](https://aclanthology.org/2021.emnlp-main.54.pdf).
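To make the *Lexical Bias* strategy concrete, here is a minimal sketch of the scoring-and-assignment step. This is illustrative only: the lemma fields, the bias-lemma selection heuristic, and the split sizes are simplified assumptions, not the authors' implementation (see the paper for the exact procedure).

```python
from collections import Counter

def biased_lemmas(stories, top_k=100):
    """Pick lemmas whose frequency differs most between normative and
    divergent actions (a crude stand-in for the paper's bias criterion)."""
    moral = Counter(l for s in stories for l in s["moral_action_lemmas"])
    immoral = Counter(l for s in stories for l in s["immoral_action_lemmas"])
    gap = {l: abs(moral[l] - immoral[l]) for l in set(moral) | set(immoral)}
    return set(sorted(gap, key=gap.get, reverse=True)[:top_k])

def lexical_bias_split(stories, test_n, dev_n, top_k=100):
    """Assign the least lexically biased stories to test, then dev;
    the remainder becomes the training set."""
    biased = biased_lemmas(stories, top_k)

    def score(s):
        lemmas = s["moral_action_lemmas"] + s["immoral_action_lemmas"]
        return sum(1 for l in lemmas if l in biased)

    ordered = sorted(stories, key=score)  # lowest bias score first
    return (ordered[:test_n],                      # test
            ordered[test_n:test_n + dev_n],        # dev
            ordered[test_n + dev_n:])              # train
```

The same sort-then-slice pattern applies to the *Minimal Pairs* strategy, with the Damerau-Levenshtein distance between paired actions as the sort key instead of the bias score.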
## Dataset Creation
### Curation Rationale
Please refer to [*Section 2* and the *Ethical Considerations* section in the dataset paper](https://aclanthology.org/2021.emnlp-main.54.pdf).
### Source Data
#### Initial Data Collection and Normalization
Please refer to [*Section 2* in the dataset paper](https://aclanthology.org/2021.emnlp-main.54.pdf).
#### Who are the source language producers?
Please refer to [the *Ethical Considerations* section in the dataset paper](https://aclanthology.org/2021.emnlp-main.54.pdf).
### Annotations
#### Annotation process
Please refer to [*Section 2* and the *Ethical Considerations* section in the dataset paper](https://aclanthology.org/2021.emnlp-main.54.pdf).
#### Who are the annotators?
Please refer to [the *Ethical Considerations* section in the dataset paper](https://aclanthology.org/2021.emnlp-main.54.pdf).
### Personal and Sensitive Information
[N/A]
## Considerations for Using the Data
### Social Impact of Dataset
Please refer to [the *Ethical Considerations* section in the dataset paper](https://aclanthology.org/2021.emnlp-main.54.pdf).
### Discussion of Biases
Please refer to [the *Ethical Considerations* section in the dataset paper](https://aclanthology.org/2021.emnlp-main.54.pdf).
### Other Known Limitations
Please refer to [the *Ethical Considerations* section in the dataset paper](https://aclanthology.org/2021.emnlp-main.54.pdf).
## Additional Information
### Dataset Curators
[Denis Emelin](https://demelin.github.io)
### Licensing Information
MIT
### Citation Information
```text
@article{Emelin2021MoralSS,
  title={Moral Stories: Situated Reasoning about Norms, Intents, Actions, and their Consequences},
  author={Denis Emelin and Ronan Le Bras and Jena D. Hwang and Maxwell Forbes and Yejin Choi},
  journal={ArXiv},
  year={2021},
  volume={abs/2012.15738}
}
```
 |
dferndz/cSQuAD2 | ---
annotations_creators:
- expert-generated
language:
- en
language_creators:
- other
license:
- apache-2.0
multilinguality:
- monolingual
pretty_name: cSQuAD2
size_categories: []
source_datasets: []
tags: []
task_categories:
- question-answering
task_ids: []
---
# Dataset Card for cSQuAD2
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
A contrast set for evaluating models trained on SQuAD on out-of-domain data.
### Supported Tasks
Evaluate question-answering
### Languages
English
## Dataset Structure
### Data Instances
The dataset contains 40 instances.
### Data Fields
| Field        | Description                                   |
|--------------|-----------------------------------------------|
| id           | Id of the document containing the context     |
| title        | Title of the document                         |
| context      | The context of the question                   |
| question     | The question to answer                        |
| answers      | A list of possible answers from the context   |
| answer_start | The index in context where the answer starts  |
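The `answers` / `answer_start` pair follows the SQuAD convention: each start index points into `context`, so an answer span can be recovered by slicing. A minimal sketch (the record below is illustrative, not taken from the dataset; the nesting of `text` and `answer_start` under `answers` follows the usual SQuAD layout):

```python
# Sketch: recovering a SQuAD-style answer span from the context
# using its character start offset.
record = {
    "id": "0",
    "title": "Example article",
    "context": "The Eiffel Tower was completed in 1889 in Paris.",
    "question": "When was the Eiffel Tower completed?",
    "answers": {"text": ["1889"], "answer_start": [34]},
}

def extract_answer(record: dict, i: int = 0) -> str:
    """Slice the i-th answer out of the context using its start offset."""
    start = record["answers"]["answer_start"][i]
    text = record["answers"]["text"][i]
    span = record["context"][start:start + len(text)]
    assert span == text  # the offset must point at the answer text
    return span

print(extract_answer(record))  # 1889
```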
### Data Splits
A single `test` split is provided
## Dataset Creation
The dataset was created from Wikipedia articles.
## Additional Information
### Licensing Information
Apache 2.0 license
### Citation Information
TODO: add citations |
mHossain/final_train_v1_380000 | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: input_text
dtype: string
- name: target_text
dtype: string
- name: prefix
dtype: string
splits:
- name: train
num_bytes: 11599066.8
num_examples: 27000
- name: test
num_bytes: 1288785.2
num_examples: 3000
download_size: 5635427
dataset_size: 12887852.0
---
# Dataset Card for "final_train_v1_380000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ZiHDeng/hf-ny8-v3 | ---
dataset_info:
features:
- name: repo_id
dtype: string
- name: file_path
dtype: string
- name: content
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 3970704
num_examples: 8873
download_size: 735717
dataset_size: 3970704
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
commaai/comma2k19 | ---
license: mit
---
# comma2k19
[comma.ai](https://comma.ai) presents comma2k19, a dataset of over 33 hours of commuting on California's Highway 280. It comprises 2019 segments, each 1 minute long, recorded on a 20 km stretch of highway between San Jose and San Francisco. comma2k19 is a fully reproducible and scalable dataset. The data was collected with comma [EONs](https://comma.ai/shop/products/eon-gold-dashcam-devkit/), which have sensors similar to those of any modern smartphone, including a road-facing camera, phone GPS, thermometers, and a 9-axis IMU. Additionally, the EON captures raw GNSS measurements and all CAN data sent by the car through a comma [grey panda](https://comma.ai/shop/products/panda-obd-ii-dongle/).
<img src="https://github.com/commaai/comma2k19/blob/master/assets/testmesh3d.png?raw=true"/>
Alongside the dataset, we introduce [Laika](https://github.com/commaai/laika), an open-source GNSS processing library. Laika produces positions about 40% more accurate than those of the GNSS module used to collect the raw data. This dataset includes pose (position + orientation) estimates, in a global reference frame, of the recording camera. These poses were computed with a tightly coupled INS/GNSS/Vision optimizer that relies on data processed by Laika. comma2k19 is ideal for the development and validation of tightly coupled GNSS algorithms and mapping algorithms that work with commodity sensors.
<img src="https://github.com/commaai/comma2k19/blob/master/assets/merged.png?raw=true"/>
## Publication
For a detailed write-up about this dataset, please refer to our [paper](https://arxiv.org/abs/1812.05752v1). If you use comma2k19 or Laika in your research, please consider citing
```text
@misc{1812.05752,
Author = {Harald Schafer and Eder Santana and Andrew Haden and Riccardo Biasini},
Title = {A Commute in Data: The comma2k19 Dataset},
Year = {2018},
Eprint = {arXiv:1812.05752},
}
```
|
salmasally/first | ---
license: bigcode-openrail-m
---
|
arbml/PADIC | ---
dataset_info:
features:
- name: ALGIERS
dtype: string
- name: ANNABA
dtype: string
- name: MODERN-STANDARD-ARABIC
dtype: string
- name: SYRIAN
dtype: string
- name: PALESTINIAN
dtype: string
splits:
- name: train
num_bytes: 1381043
num_examples: 7213
download_size: 848313
dataset_size: 1381043
---
# Dataset Card for "PADIC"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
qazi-ali/llama_2-product-titles-esci-sft-test | ---
dataset_info:
features:
- name: index
dtype: int64
- name: query
dtype: string
- name: text
dtype: string
- name: label
dtype: string
- name: preds
dtype: string
- name: average_score
dtype: float64
- name: total_score
dtype: float64
- name: max_score
dtype: float64
- name: min_score
dtype: float64
- name: best_title
dtype: string
- name: clean_preds
dtype: string
- name: new_score
dtype: float64
- name: good_pred
dtype: bool
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1656074.0
num_examples: 1629
download_size: 889043
dataset_size: 1656074.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama_2-product-titles-esci-sft-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tollefj/snli-NOB | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 63889357
num_examples: 550152
- name: validation
num_bytes: 1225673
num_examples: 10000
- name: test
num_bytes: 1218676
num_examples: 10000
download_size: 19990084
dataset_size: 66333706
license: cc-by-4.0
---
# Dataset Card for "snli-NOB"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ixarchakos/laydown_pairs | ---
license: bsd
tags:
- code
pretty_name: laydown
size_categories:
- 10K<n<100K
--- |
appvoid/no-prompt-50k | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 126807704
num_examples: 50000
download_size: 67249109
dataset_size: 126807704
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
heliosprime/twitter_dataset_1713142541 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 9728
num_examples: 27
download_size: 11919
dataset_size: 9728
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713142541"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hoanganhknk/zalo | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: temp
dtype: string
splits:
- name: train
num_bytes: 71711.0
num_examples: 1
download_size: 72626
dataset_size: 71711.0
---
# Dataset Card for "zalo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_ehartford__WizardLM-1.0-Uncensored-CodeLlama-34b | ---
pretty_name: Evaluation run of ehartford/WizardLM-1.0-Uncensored-CodeLlama-34b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ehartford/WizardLM-1.0-Uncensored-CodeLlama-34b](https://huggingface.co/ehartford/WizardLM-1.0-Uncensored-CodeLlama-34b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ehartford__WizardLM-1.0-Uncensored-CodeLlama-34b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-22T11:52:44.737465](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__WizardLM-1.0-Uncensored-CodeLlama-34b/blob/main/results_2023-10-22T11-52-44.737465.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0019924496644295304,\n\
\ \"em_stderr\": 0.00045666764626670005,\n \"f1\": 0.05511640100671131,\n\
\ \"f1_stderr\": 0.0012812534382648734,\n \"acc\": 0.46045352575705806,\n\
\ \"acc_stderr\": 0.01174889042363714\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0019924496644295304,\n \"em_stderr\": 0.00045666764626670005,\n\
\ \"f1\": 0.05511640100671131,\n \"f1_stderr\": 0.0012812534382648734\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.19636087945413191,\n \
\ \"acc_stderr\": 0.010942090791564744\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7245461720599842,\n \"acc_stderr\": 0.012555690055709537\n\
\ }\n}\n```"
repo_url: https://huggingface.co/ehartford/WizardLM-1.0-Uncensored-CodeLlama-34b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|arc:challenge|25_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_22T11_52_44.737465
path:
- '**/details_harness|drop|3_2023-10-22T11-52-44.737465.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-22T11-52-44.737465.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_22T11_52_44.737465
path:
- '**/details_harness|gsm8k|5_2023-10-22T11-52-44.737465.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-22T11-52-44.737465.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hellaswag|10_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_22T11_52_44.737465
path:
- '**/details_harness|winogrande|5_2023-10-22T11-52-44.737465.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-22T11-52-44.737465.parquet'
- config_name: results
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- results_2023-09-05T09:02:22.331640.parquet
- split: 2023_10_22T11_52_44.737465
path:
- results_2023-10-22T11-52-44.737465.parquet
- split: latest
path:
- results_2023-10-22T11-52-44.737465.parquet
---
# Dataset Card for Evaluation run of ehartford/WizardLM-1.0-Uncensored-CodeLlama-34b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ehartford/WizardLM-1.0-Uncensored-CodeLlama-34b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ehartford/WizardLM-1.0-Uncensored-CodeLlama-34b](https://huggingface.co/ehartford/WizardLM-1.0-Uncensored-CodeLlama-34b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ehartford__WizardLM-1.0-Uncensored-CodeLlama-34b",
"harness_winogrande_5",
split="train")
```
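As noted above, each run's split is named with the run timestamp, with `_` standing in for `-` in the date and `:` in the time (e.g. `2023_10_22T11_52_44.737465`). A small helper can map a split name back to a `datetime` (a sketch; the function name is ours, not part of the dataset):

```python
from datetime import datetime

def split_name_to_datetime(split_name: str) -> datetime:
    # Split names replace "-" in the date part and ":" in the time part
    # with "_", e.g. "2023_10_22T11_52_44.737465" -> 2023-10-22 11:52:44.737465.
    date_part, time_part = split_name.split("T")
    iso = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    return datetime.fromisoformat(iso)

print(split_name_to_datetime("2023_10_22T11_52_44.737465"))
```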
## Latest results
These are the [latest results from run 2023-10-22T11:52:44.737465](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__WizardLM-1.0-Uncensored-CodeLlama-34b/blob/main/results_2023-10-22T11-52-44.737465.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0019924496644295304,
"em_stderr": 0.00045666764626670005,
"f1": 0.05511640100671131,
"f1_stderr": 0.0012812534382648734,
"acc": 0.46045352575705806,
"acc_stderr": 0.01174889042363714
},
"harness|drop|3": {
"em": 0.0019924496644295304,
"em_stderr": 0.00045666764626670005,
"f1": 0.05511640100671131,
"f1_stderr": 0.0012812534382648734
},
"harness|gsm8k|5": {
"acc": 0.19636087945413191,
"acc_stderr": 0.010942090791564744
},
"harness|winogrande|5": {
"acc": 0.7245461720599842,
"acc_stderr": 0.012555690055709537
}
}
```
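In the dict above, the `"all"` entry aggregates across tasks, while per-task entries are keyed as `harness|<task>|<n_shots>`. Pulling out the per-task accuracies is a one-line comprehension (a sketch using a subset of the numbers shown here):

```python
# Subset of the results dict shown above; "all" aggregates across tasks.
results = {
    "all": {"em": 0.0019924496644295304, "f1": 0.05511640100671131,
            "acc": 0.46045352575705806},
    "harness|drop|3": {"em": 0.0019924496644295304,
                       "f1": 0.05511640100671131},
    "harness|gsm8k|5": {"acc": 0.19636087945413191},
    "harness|winogrande|5": {"acc": 0.7245461720599842},
}

# Keep only per-task accuracy metrics, skipping the aggregate entry
# (drop reports em/f1 rather than acc, so it is filtered out too).
task_accs = {task: m["acc"] for task, m in results.items()
             if task != "all" and "acc" in m}
print(task_accs)
```

Note that the aggregate `acc` under `"all"` is simply the mean of the per-task `acc` values.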
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
AdapterOcean/med_alpaca_standardized_cluster_72 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 67345996
num_examples: 7203
download_size: 18895951
dataset_size: 67345996
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_72"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Arthuerwang/finetune_lora_pikachu | ---
dataset_info:
features:
- name: image
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 922
num_examples: 5
download_size: 2618
dataset_size: 922
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
iacolabmat/Futbol | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 5555.105263157895
num_examples: 26
- name: test
num_bytes: 2563.8947368421054
num_examples: 12
download_size: 7445
dataset_size: 8119.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
RENILSON/clonar | ---
license: openrail
---
|
Yorth/dalleTestDataFiltered | ---
dataset_info:
features:
- name: image
dtype: image
- name: caption
dtype: string
- name: resolution
dtype: string
splits:
- name: train
num_bytes: 1177555104.0891273
num_examples: 4584
download_size: 1216930750
dataset_size: 1177555104.0891273
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
clairempare/emoji-faces-labeled-emotions | ---
license: apache-2.0
---
|
varun4/AdventureTimeCaptions | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 62319.0
num_examples: 3
download_size: 58529
dataset_size: 62319.0
---
# Dataset Card for "AdventureTimeCaptions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/shouhou_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of shouhou/祥鳳 (Kantai Collection)
This is the dataset of shouhou/祥鳳 (Kantai Collection), containing 463 images and their tags.
The core tags of this character are `long_hair, black_hair, brown_eyes, ahoge, breasts, low-tied_long_hair, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 463 | 376.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shouhou_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 463 | 279.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shouhou_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 988 | 533.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shouhou_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 463 | 357.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shouhou_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 988 | 645.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shouhou_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/shouhou_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of the tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, bandeau, cleavage, collarbone, hadanugi_dousa, medium_breasts, open_kimono, solo, hair_ribbon, simple_background, gloves, looking_at_viewer, pleated_skirt, white_background, black_skirt, smile, blush, brown_hair, twitter_username |
| 1 | 6 |  |  |  |  |  | 1girl, bandeau, bow_(weapon), hadanugi_dousa, hair_ribbon, open_kimono, solo, cleavage, medium_breasts, partially_fingerless_gloves, smile, black_skirt, pleated_skirt, single_glove |
| 2 | 6 |  |  |  |  |  | 1girl, bandeau, blush, hadanugi_dousa, open_kimono, solo, upper_body, simple_background, white_background, smile, cleavage |
| 3 | 13 |  |  |  |  |  | 1girl, pleated_skirt, bandeau, black_skirt, hakama_short_skirt, simple_background, solo, white_background, black_hakama, hadanugi_dousa, open_kimono, smile, blush, thighhighs, white_kimono, dated |
| 4 | 12 |  |  |  |  |  | 1girl, smile, solo, oil-paper_umbrella, looking_at_viewer, kimono, skirt, rain, holding_umbrella, blush |
| 5 | 5 |  |  |  |  |  | 2girls, simple_background, white_background, black_hakama, black_skirt, blush, hachimaki, hakama_skirt, open_mouth, thighhighs, brown_hair, high_ponytail, holding, wide_sleeves |
| 6 | 6 |  |  |  |  |  | 2girls, hachimaki, high_ponytail, simple_background, white_background, japanese_clothes, skirt, |_|, aged_down, blush, shorts, wide_sleeves |
| 7 | 13 |  |  |  |  |  | 1girl, solo, bikini, navel, looking_at_viewer, smile, white_background, blush, simple_background, cowboy_shot, large_breasts, open_mouth, collarbone, medium_breasts |
| 8 | 8 |  |  |  |  |  | 1girl, solo, black_skirt, cowboy_shot, long_sleeves, pleated_skirt, serafuku, simple_background, white_background, alternate_costume, looking_at_viewer, neckerchief, black_sailor_collar, black_thighhighs, white_shirt |
| 9 | 7 |  |  |  |  |  | enmaided, 1girl, black_dress, looking_at_viewer, solo, white_apron, maid_apron, maid_headdress, cowboy_shot, dated, long_sleeves, frilled_apron, one-hour_drawing_challenge, puffy_sleeves, simple_background, smile, thighhighs, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bandeau | cleavage | collarbone | hadanugi_dousa | medium_breasts | open_kimono | solo | hair_ribbon | simple_background | gloves | looking_at_viewer | pleated_skirt | white_background | black_skirt | smile | blush | brown_hair | twitter_username | bow_(weapon) | partially_fingerless_gloves | single_glove | upper_body | hakama_short_skirt | black_hakama | thighhighs | white_kimono | dated | oil-paper_umbrella | kimono | skirt | rain | holding_umbrella | 2girls | hachimaki | hakama_skirt | open_mouth | high_ponytail | holding | wide_sleeves | japanese_clothes | |_| | aged_down | shorts | bikini | navel | cowboy_shot | large_breasts | long_sleeves | serafuku | alternate_costume | neckerchief | black_sailor_collar | black_thighhighs | white_shirt | enmaided | black_dress | white_apron | maid_apron | maid_headdress | frilled_apron | one-hour_drawing_challenge | puffy_sleeves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------|:-----------|:-------------|:-----------------|:-----------------|:--------------|:-------|:--------------|:--------------------|:---------|:--------------------|:----------------|:-------------------|:--------------|:--------|:--------|:-------------|:-------------------|:---------------|:------------------------------|:---------------|:-------------|:---------------------|:---------------|:-------------|:---------------|:--------|:---------------------|:---------|:--------|:-------|:-------------------|:---------|:------------|:---------------|:-------------|:----------------|:----------|:---------------|:-------------------|:------|:------------|:---------|:---------|:--------|:--------------|:----------------|:---------------|:-----------|:--------------------|:--------------|:----------------------|:-------------------|:--------------|:-----------|:--------------|:--------------|:-------------|:-----------------|:----------------|:-----------------------------|:----------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | | X | X | X | X | X | | | | X | | X | X | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | | X | | X | X | | X | | | | X | | X | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 13 |  |  |  |  |  | X | X | | | X | | X | X | | X | | | X | X | X | X | X | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 12 |  |  |  |  |  | X | | | | | | | X | | | | X | | | | X | X | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | | | | | | | | | | X | | | | X | X | | X | X | | | | | | | X | X | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | | | | | | | | | | X | | | | X | | | X | | | | | | | | | | | | | | X | | | X | X | | | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 7 | 13 |  |  |  |  |  | X | | | X | | X | | X | | X | | X | | X | | X | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | |
| 8 | 8 |  |  |  |  |  | X | | | | | | | X | | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | X | X | X | X | X | X | | | | | | | | |
| 9 | 7 |  |  |  |  |  | X | | | | | | | X | | X | | X | | X | | X | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | X | | X | | | | | | | X | X | X | X | X | X | X | X |
|
open-llm-leaderboard/details_ehartford__dolphin-2.2-70b | ---
pretty_name: Evaluation run of ehartford/dolphin-2.2-70b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ehartford/dolphin-2.2-70b](https://huggingface.co/ehartford/dolphin-2.2-70b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ehartford__dolphin-2.2-70b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-08T15:41:29.981879](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__dolphin-2.2-70b/blob/main/results_2023-12-08T15-41-29.981879.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6908692958524526,\n\
\ \"acc_stderr\": 0.030565557295291628,\n \"acc_norm\": 0.6948282125429526,\n\
\ \"acc_norm_stderr\": 0.031156142381821045,\n \"mc1\": 0.42472460220318237,\n\
\ \"mc1_stderr\": 0.01730400095716748,\n \"mc2\": 0.6013577707139347,\n\
\ \"mc2_stderr\": 0.014840096510342907\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6569965870307167,\n \"acc_stderr\": 0.013872423223718166,\n\
\ \"acc_norm\": 0.7005119453924915,\n \"acc_norm_stderr\": 0.01338502163731357\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.668990240987851,\n\
\ \"acc_stderr\": 0.004696148339570979,\n \"acc_norm\": 0.8596893049193388,\n\
\ \"acc_norm_stderr\": 0.0034659928816107746\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7763157894736842,\n \"acc_stderr\": 0.03391160934343603,\n\
\ \"acc_norm\": 0.7763157894736842,\n \"acc_norm_stderr\": 0.03391160934343603\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n\
\ \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724057,\n\
\ \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724057\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7986111111111112,\n\
\ \"acc_stderr\": 0.033536474697138406,\n \"acc_norm\": 0.7986111111111112,\n\
\ \"acc_norm_stderr\": 0.033536474697138406\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6723404255319149,\n \"acc_stderr\": 0.03068302084323101,\n\
\ \"acc_norm\": 0.6723404255319149,\n \"acc_norm_stderr\": 0.03068302084323101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.040824829046386284,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.040824829046386284\n \
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4497354497354497,\n \"acc_stderr\": 0.02562085704293665,\n \"\
acc_norm\": 0.4497354497354497,\n \"acc_norm_stderr\": 0.02562085704293665\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_biology|5\"\
: {\n \"acc\": 0.8419354838709677,\n \"acc_stderr\": 0.020752831511875278,\n\
\ \"acc_norm\": 0.8419354838709677,\n \"acc_norm_stderr\": 0.020752831511875278\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5467980295566502,\n \"acc_stderr\": 0.03502544650845872,\n \"\
acc_norm\": 0.5467980295566502,\n \"acc_norm_stderr\": 0.03502544650845872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\"\
: 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8242424242424242,\n \"acc_stderr\": 0.02972094300622445,\n\
\ \"acc_norm\": 0.8242424242424242,\n \"acc_norm_stderr\": 0.02972094300622445\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8787878787878788,\n \"acc_stderr\": 0.023253157951942084,\n \"\
acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.023253157951942084\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9222797927461139,\n \"acc_stderr\": 0.01932180555722313,\n\
\ \"acc_norm\": 0.9222797927461139,\n \"acc_norm_stderr\": 0.01932180555722313\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6974358974358974,\n \"acc_stderr\": 0.02329088805377272,\n \
\ \"acc_norm\": 0.6974358974358974,\n \"acc_norm_stderr\": 0.02329088805377272\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114986,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114986\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7605042016806722,\n \"acc_stderr\": 0.02772206549336126,\n \
\ \"acc_norm\": 0.7605042016806722,\n \"acc_norm_stderr\": 0.02772206549336126\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4370860927152318,\n \"acc_stderr\": 0.04050035722230636,\n \"\
acc_norm\": 0.4370860927152318,\n \"acc_norm_stderr\": 0.04050035722230636\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8862385321100917,\n \"acc_stderr\": 0.013613614800232808,\n \"\
acc_norm\": 0.8862385321100917,\n \"acc_norm_stderr\": 0.013613614800232808\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5740740740740741,\n \"acc_stderr\": 0.033723432716530624,\n \"\
acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.033723432716530624\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9166666666666666,\n \"acc_stderr\": 0.019398452135813905,\n \"\
acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.019398452135813905\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8818565400843882,\n \"acc_stderr\": 0.021011052659878467,\n \
\ \"acc_norm\": 0.8818565400843882,\n \"acc_norm_stderr\": 0.021011052659878467\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7713004484304933,\n\
\ \"acc_stderr\": 0.028188240046929196,\n \"acc_norm\": 0.7713004484304933,\n\
\ \"acc_norm_stderr\": 0.028188240046929196\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8625954198473282,\n \"acc_stderr\": 0.03019482399680448,\n\
\ \"acc_norm\": 0.8625954198473282,\n \"acc_norm_stderr\": 0.03019482399680448\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.859504132231405,\n \"acc_stderr\": 0.03172233426002158,\n \"acc_norm\"\
: 0.859504132231405,\n \"acc_norm_stderr\": 0.03172233426002158\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8282208588957055,\n \"acc_stderr\": 0.029634717272371033,\n\
\ \"acc_norm\": 0.8282208588957055,\n \"acc_norm_stderr\": 0.029634717272371033\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.03492606476623792,\n\
\ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.03492606476623792\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8492975734355045,\n\
\ \"acc_stderr\": 0.01279342088312082,\n \"acc_norm\": 0.8492975734355045,\n\
\ \"acc_norm_stderr\": 0.01279342088312082\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8034682080924855,\n \"acc_stderr\": 0.021393961404363847,\n\
\ \"acc_norm\": 0.8034682080924855,\n \"acc_norm_stderr\": 0.021393961404363847\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.587709497206704,\n\
\ \"acc_stderr\": 0.016463200238114515,\n \"acc_norm\": 0.587709497206704,\n\
\ \"acc_norm_stderr\": 0.016463200238114515\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242553,\n\
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242553\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7620578778135049,\n\
\ \"acc_stderr\": 0.02418515064781871,\n \"acc_norm\": 0.7620578778135049,\n\
\ \"acc_norm_stderr\": 0.02418515064781871\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.0216138093952248,\n\
\ \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.0216138093952248\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5460992907801419,\n \"acc_stderr\": 0.029700453247291477,\n \
\ \"acc_norm\": 0.5460992907801419,\n \"acc_norm_stderr\": 0.029700453247291477\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5430247718383312,\n\
\ \"acc_stderr\": 0.012722869501611419,\n \"acc_norm\": 0.5430247718383312,\n\
\ \"acc_norm_stderr\": 0.012722869501611419\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7434640522875817,\n \"acc_stderr\": 0.017667841612379005,\n \
\ \"acc_norm\": 0.7434640522875817,\n \"acc_norm_stderr\": 0.017667841612379005\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7636363636363637,\n\
\ \"acc_stderr\": 0.040693063197213754,\n \"acc_norm\": 0.7636363636363637,\n\
\ \"acc_norm_stderr\": 0.040693063197213754\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7836734693877551,\n \"acc_stderr\": 0.02635891633490402,\n\
\ \"acc_norm\": 0.7836734693877551,\n \"acc_norm_stderr\": 0.02635891633490402\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n\
\ \"acc_stderr\": 0.021628920516700637,\n \"acc_norm\": 0.8955223880597015,\n\
\ \"acc_norm_stderr\": 0.021628920516700637\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.0266405825391332,\n\
\ \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.0266405825391332\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42472460220318237,\n\
\ \"mc1_stderr\": 0.01730400095716748,\n \"mc2\": 0.6013577707139347,\n\
\ \"mc2_stderr\": 0.014840096510342907\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8145224940805051,\n \"acc_stderr\": 0.010923965303140505\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5678544351781653,\n \
\ \"acc_stderr\": 0.013645072137842443\n }\n}\n```"
repo_url: https://huggingface.co/ehartford/dolphin-2.2-70b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|arc:challenge|25_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|gsm8k|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hellaswag|10_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-08T15-41-29.981879.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-08T15-41-29.981879.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- '**/details_harness|winogrande|5_2023-12-08T15-41-29.981879.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-08T15-41-29.981879.parquet'
- config_name: results
data_files:
- split: 2023_12_08T15_41_29.981879
path:
- results_2023-12-08T15-41-29.981879.parquet
- split: latest
path:
- results_2023-12-08T15-41-29.981879.parquet
---
# Dataset Card for Evaluation run of ehartford/dolphin-2.2-70b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ehartford/dolphin-2.2-70b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ehartford/dolphin-2.2-70b](https://huggingface.co/ehartford/dolphin-2.2-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ehartford__dolphin-2.2-70b",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-12-08T15:41:29.981879](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__dolphin-2.2-70b/blob/main/results_2023-12-08T15-41-29.981879.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6908692958524526,
"acc_stderr": 0.030565557295291628,
"acc_norm": 0.6948282125429526,
"acc_norm_stderr": 0.031156142381821045,
"mc1": 0.42472460220318237,
"mc1_stderr": 0.01730400095716748,
"mc2": 0.6013577707139347,
"mc2_stderr": 0.014840096510342907
},
"harness|arc:challenge|25": {
"acc": 0.6569965870307167,
"acc_stderr": 0.013872423223718166,
"acc_norm": 0.7005119453924915,
"acc_norm_stderr": 0.01338502163731357
},
"harness|hellaswag|10": {
"acc": 0.668990240987851,
"acc_stderr": 0.004696148339570979,
"acc_norm": 0.8596893049193388,
"acc_norm_stderr": 0.0034659928816107746
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7763157894736842,
"acc_stderr": 0.03391160934343603,
"acc_norm": 0.7763157894736842,
"acc_norm_stderr": 0.03391160934343603
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724057,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7986111111111112,
"acc_stderr": 0.033536474697138406,
"acc_norm": 0.7986111111111112,
"acc_norm_stderr": 0.033536474697138406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6723404255319149,
"acc_stderr": 0.03068302084323101,
"acc_norm": 0.6723404255319149,
"acc_norm_stderr": 0.03068302084323101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6,
"acc_stderr": 0.040824829046386284,
"acc_norm": 0.6,
"acc_norm_stderr": 0.040824829046386284
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4497354497354497,
"acc_stderr": 0.02562085704293665,
"acc_norm": 0.4497354497354497,
"acc_norm_stderr": 0.02562085704293665
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8419354838709677,
"acc_stderr": 0.020752831511875278,
"acc_norm": 0.8419354838709677,
"acc_norm_stderr": 0.020752831511875278
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5467980295566502,
"acc_stderr": 0.03502544650845872,
"acc_norm": 0.5467980295566502,
"acc_norm_stderr": 0.03502544650845872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8242424242424242,
"acc_stderr": 0.02972094300622445,
"acc_norm": 0.8242424242424242,
"acc_norm_stderr": 0.02972094300622445
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8787878787878788,
"acc_stderr": 0.023253157951942084,
"acc_norm": 0.8787878787878788,
"acc_norm_stderr": 0.023253157951942084
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9222797927461139,
"acc_stderr": 0.01932180555722313,
"acc_norm": 0.9222797927461139,
"acc_norm_stderr": 0.01932180555722313
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6974358974358974,
"acc_stderr": 0.02329088805377272,
"acc_norm": 0.6974358974358974,
"acc_norm_stderr": 0.02329088805377272
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114986,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114986
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7605042016806722,
"acc_stderr": 0.02772206549336126,
"acc_norm": 0.7605042016806722,
"acc_norm_stderr": 0.02772206549336126
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4370860927152318,
"acc_stderr": 0.04050035722230636,
"acc_norm": 0.4370860927152318,
"acc_norm_stderr": 0.04050035722230636
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8862385321100917,
"acc_stderr": 0.013613614800232808,
"acc_norm": 0.8862385321100917,
"acc_norm_stderr": 0.013613614800232808
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.033723432716530624,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.033723432716530624
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.019398452135813905,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.019398452135813905
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8818565400843882,
"acc_stderr": 0.021011052659878467,
"acc_norm": 0.8818565400843882,
"acc_norm_stderr": 0.021011052659878467
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7713004484304933,
"acc_stderr": 0.028188240046929196,
"acc_norm": 0.7713004484304933,
"acc_norm_stderr": 0.028188240046929196
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8625954198473282,
"acc_stderr": 0.03019482399680448,
"acc_norm": 0.8625954198473282,
"acc_norm_stderr": 0.03019482399680448
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.859504132231405,
"acc_stderr": 0.03172233426002158,
"acc_norm": 0.859504132231405,
"acc_norm_stderr": 0.03172233426002158
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8282208588957055,
"acc_stderr": 0.029634717272371033,
"acc_norm": 0.8282208588957055,
"acc_norm_stderr": 0.029634717272371033
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.03492606476623792,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.03492606476623792
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8492975734355045,
"acc_stderr": 0.01279342088312082,
"acc_norm": 0.8492975734355045,
"acc_norm_stderr": 0.01279342088312082
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8034682080924855,
"acc_stderr": 0.021393961404363847,
"acc_norm": 0.8034682080924855,
"acc_norm_stderr": 0.021393961404363847
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.587709497206704,
"acc_stderr": 0.016463200238114515,
"acc_norm": 0.587709497206704,
"acc_norm_stderr": 0.016463200238114515
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242553,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242553
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7620578778135049,
"acc_stderr": 0.02418515064781871,
"acc_norm": 0.7620578778135049,
"acc_norm_stderr": 0.02418515064781871
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.0216138093952248,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.0216138093952248
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5460992907801419,
"acc_stderr": 0.029700453247291477,
"acc_norm": 0.5460992907801419,
"acc_norm_stderr": 0.029700453247291477
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5430247718383312,
"acc_stderr": 0.012722869501611419,
"acc_norm": 0.5430247718383312,
"acc_norm_stderr": 0.012722869501611419
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7434640522875817,
"acc_stderr": 0.017667841612379005,
"acc_norm": 0.7434640522875817,
"acc_norm_stderr": 0.017667841612379005
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.040693063197213754,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.040693063197213754
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7836734693877551,
"acc_stderr": 0.02635891633490402,
"acc_norm": 0.7836734693877551,
"acc_norm_stderr": 0.02635891633490402
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700637,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700637
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.0266405825391332,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.0266405825391332
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42472460220318237,
"mc1_stderr": 0.01730400095716748,
"mc2": 0.6013577707139347,
"mc2_stderr": 0.014840096510342907
},
"harness|winogrande|5": {
"acc": 0.8145224940805051,
"acc_stderr": 0.010923965303140505
},
"harness|gsm8k|5": {
"acc": 0.5678544351781653,
"acc_stderr": 0.013645072137842443
}
}
```
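For reference, the reported `acc_stderr` values above look consistent with the usual standard error of a sample proportion, sqrt(p(1-p)/(n-1)). A minimal sketch checking this for `hendrycksTest-abstract_algebra`, assuming its test split has 100 questions (an assumption about the MMLU benchmark, not stated in this card):

```python
import math

# Reported values for harness|hendrycksTest-abstract_algebra|5 (from the JSON above).
acc = 0.37
acc_stderr = 0.04852365870939099

# Standard error of a sample proportion with n - 1 in the denominator,
# assuming n = 100 questions in the abstract_algebra test split.
n = 100
expected_stderr = math.sqrt(acc * (1 - acc) / (n - 1))

print(abs(expected_stderr - acc_stderr) < 1e-6)  # -> True
```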
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
BambiMC/ts_train | ---
license: mit
---
|
rinabuoy/Eng-Khmer-Agg | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 29660480
num_examples: 75292
- name: test
num_bytes: 2619029
num_examples: 5911
download_size: 12006160
dataset_size: 32279509
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
abdiharyadi/yelp-juncenli-sampled-opus-translated | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: original_text
dtype: string
splits:
- name: train
num_bytes: 44556
num_examples: 445
- name: validation
num_bytes: 479
num_examples: 4
- name: test
num_bytes: 1052
num_examples: 10
download_size: 33531
dataset_size: 46087
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
todi1/pasmr1 | ---
license: openrail
---
|
HydraLM/partitioned_v3_standardized_00 | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: dataset_id
dtype: string
- name: unique_id
dtype: string
splits:
- name: train
num_bytes: 17506989.303956702
num_examples: 32558
download_size: 12459472
dataset_size: 17506989.303956702
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "partitioned_v3_standardized_00"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
alessandrogd/Giovanelabs | ---
license: openrail
---
|
rai-sandeep/test_ds | ---
dataset_info:
features:
- name: category
dtype: string
- name: topic
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 2082
num_examples: 4
download_size: 6582
dataset_size: 2082
---
# Dataset Card for "test_ds"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zolak/twitter_dataset_78_1713213385 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2267234
num_examples: 5414
download_size: 1125811
dataset_size: 2267234
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
cbalaji/gen_solution_postings | ---
dataset_info:
features:
- name: Instruction
dtype: string
- name: Description
dtype: string
- name: Posting
dtype: string
splits:
- name: train
num_bytes: 50174
num_examples: 12
download_size: 53513
dataset_size: 50174
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "gen_solution_postings"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rdp-studio/paimon-voice | ---
license: cc-by-nc-sa-4.0
---
This dataset is currently being uploaded. |
allenai/scitail | ---
language:
- en
paperswithcode_id: scitail
pretty_name: SciTail
dataset_info:
- config_name: dgem_format
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: string
- name: hypothesis_graph_structure
dtype: string
splits:
- name: train
num_bytes: 6817626
num_examples: 23088
- name: test
num_bytes: 606867
num_examples: 2126
- name: validation
num_bytes: 393209
num_examples: 1304
download_size: 2007018
dataset_size: 7817702
- config_name: predictor_format
features:
- name: answer
dtype: string
- name: sentence2_structure
dtype: string
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: gold_label
dtype: string
- name: question
dtype: string
splits:
- name: train
num_bytes: 8864108
num_examples: 23587
- name: test
num_bytes: 795275
num_examples: 2126
- name: validation
num_bytes: 510140
num_examples: 1304
download_size: 2169238
dataset_size: 10169523
- config_name: snli_format
features:
- name: sentence1_binary_parse
dtype: string
- name: sentence1_parse
dtype: string
- name: sentence1
dtype: string
- name: sentence2_parse
dtype: string
- name: sentence2
dtype: string
- name: annotator_labels
sequence: string
- name: gold_label
dtype: string
splits:
- name: train
num_bytes: 22457379
num_examples: 23596
- name: test
num_bytes: 2005142
num_examples: 2126
- name: validation
num_bytes: 1264378
num_examples: 1304
download_size: 7476483
dataset_size: 25726899
- config_name: tsv_format
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 4606527
num_examples: 23097
- name: test
num_bytes: 410267
num_examples: 2126
- name: validation
num_bytes: 260422
num_examples: 1304
download_size: 1836546
dataset_size: 5277216
configs:
- config_name: dgem_format
data_files:
- split: train
path: dgem_format/train-*
- split: test
path: dgem_format/test-*
- split: validation
path: dgem_format/validation-*
- config_name: predictor_format
data_files:
- split: train
path: predictor_format/train-*
- split: test
path: predictor_format/test-*
- split: validation
path: predictor_format/validation-*
- config_name: snli_format
data_files:
- split: train
path: snli_format/train-*
- split: test
path: snli_format/test-*
- split: validation
path: snli_format/validation-*
- config_name: tsv_format
data_files:
- split: train
path: tsv_format/train-*
- split: test
path: tsv_format/test-*
- split: validation
path: tsv_format/validation-*
---
# Dataset Card for "scitail"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://allenai.org/data/scitail](https://allenai.org/data/scitail)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 56.70 MB
- **Size of the generated dataset:** 49.09 MB
- **Total amount of disk used:** 105.79 MB
### Dataset Summary
The SciTail dataset is an entailment dataset created from multiple-choice science exams and web sentences. Each question
and the correct answer choice are converted into an assertive statement to form the hypothesis. We use information
retrieval to obtain relevant text from a large text corpus of web sentences, and use these sentences as a premise P. We
crowdsource the annotation of such premise-hypothesis pair as supports (entails) or not (neutral), in order to create
the SciTail dataset. The dataset contains 27,026 examples: 10,101 with the entails label and 16,925 with the neutral label.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### dgem_format
- **Size of downloaded dataset files:** 14.18 MB
- **Size of the generated dataset:** 7.83 MB
- **Total amount of disk used:** 22.01 MB
An example of 'train' looks as follows.
```
```
#### predictor_format
- **Size of downloaded dataset files:** 14.18 MB
- **Size of the generated dataset:** 10.19 MB
- **Total amount of disk used:** 24.37 MB
An example of 'validation' looks as follows.
```
```
#### snli_format
- **Size of downloaded dataset files:** 14.18 MB
- **Size of the generated dataset:** 25.77 MB
- **Total amount of disk used:** 39.95 MB
An example of 'validation' looks as follows.
```
```
#### tsv_format
- **Size of downloaded dataset files:** 14.18 MB
- **Size of the generated dataset:** 5.30 MB
- **Total amount of disk used:** 19.46 MB
An example of 'validation' looks as follows.
```
```
### Data Fields
The data fields are the same among all splits.
#### dgem_format
- `premise`: a `string` feature.
- `hypothesis`: a `string` feature.
- `label`: a `string` feature.
- `hypothesis_graph_structure`: a `string` feature.
#### predictor_format
- `answer`: a `string` feature.
- `sentence2_structure`: a `string` feature.
- `sentence1`: a `string` feature.
- `sentence2`: a `string` feature.
- `gold_label`: a `string` feature.
- `question`: a `string` feature.
#### snli_format
- `sentence1_binary_parse`: a `string` feature.
- `sentence1_parse`: a `string` feature.
- `sentence1`: a `string` feature.
- `sentence2_parse`: a `string` feature.
- `sentence2`: a `string` feature.
- `annotator_labels`: a `list` of `string` features.
- `gold_label`: a `string` feature.
#### tsv_format
- `premise`: a `string` feature.
- `hypothesis`: a `string` feature.
- `label`: a `string` feature.
### Data Splits
| name |train|validation|test|
|----------------|----:|---------:|---:|
|dgem_format |23088| 1304|2126|
|predictor_format|23587| 1304|2126|
|snli_format |23596| 1304|2126|
|tsv_format |23097| 1304|2126|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@inproceedings{scitail,
Author = {Tushar Khot and Ashish Sabharwal and Peter Clark},
Booktitle = {AAAI},
Title = {{SciTail}: A Textual Entailment Dataset from Science Question Answering},
Year = {2018}
}
```
### Contributions
Thanks to [@patrickvonplaten](https://github.com/patrickvonplaten), [@mariamabarham](https://github.com/mariamabarham), [@lewtun](https://github.com/lewtun), [@thomwolf](https://github.com/thomwolf) for adding this dataset. |
heliosprime/twitter_dataset_1713065060 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 14001
num_examples: 31
download_size: 9920
dataset_size: 14001
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713065060"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
andreotte/multi-label-classification-test | ---
dataset_info:
features:
- name: label
dtype:
class_label:
names:
          '0': Door
          '1': Eaves
          '2': Gutter
          '3': Vegetation
          '4': Vent
          '5': Window
- name: pixel_values
dtype: image
splits:
- name: test
num_bytes: 9476052.0
num_examples: 151
- name: train
num_bytes: 82422534.7
num_examples: 1315
download_size: 91894615
dataset_size: 91898586.7
---
# Dataset Card for "multi-label-classification-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DIAS123/Ford | ---
license: openrail
---
|
amphora/regional_qa | ---
configs:
- config_name: presidents
data_files:
- split: test
path: rqa_presidents.csv
---
|
kostasGRG/greek-twitter-multimodal-dataset | ---
task_categories:
- text-classification
- image-classification
language:
- el
size_categories:
- n<1K
---
This is a dataset for sentiment analysis created from text-image pairs collected from Greek Twitter.

- Posted: April 2023 to September 2023
- Context: general purpose, mostly politics and athletics
- Total pairs: 260
- Labels: Negative, Neutral, Positive

The dataset is intended to be used as a test set, not for training or fine-tuning a model.
`labelling.xlsx` contains the labels for texts, images, and both modalities combined.
Pablao0948/Milhouse | ---
license: openrail
---
|
HuggingFaceGECLM/data_feedback | ---
license: openrail
---
|
abdur75648/UTRSet-Synth | ---
title: UrduSet-Synth (UTRNet)
emoji: 📖
colorFrom: red
colorTo: green
license: cc-by-nc-4.0
task_categories:
- image-to-text
language:
- ur
tags:
- ocr
- text recognition
- urdu-ocr
- utrnet
pretty_name: UTRSet-Synth
references:
- https://github.com/abdur75648/UTRNet-High-Resolution-Urdu-Text-Recognition
- https://abdur75648.github.io/UTRNet/
- https://arxiv.org/abs/2306.15782
---
The **UTRSet-Synth** dataset is introduced as a complementary training resource to the [**UTRSet-Real** Dataset](https://paperswithcode.com/dataset/utrset-real), specifically designed to enhance the effectiveness of Urdu OCR models. It is a high-quality synthetic dataset comprising 20,000 lines that closely resemble real-world representations of Urdu text.
The dataset was generated with a custom-designed synthetic data generation module that offers precise control over variations in crucial factors such as font, text size, colour, resolution, orientation, noise, style, and background. Moreover, the UTRSet-Synth dataset tackles the limitations observed in existing datasets. It addresses the challenge of standardizing fonts by incorporating over 130 diverse Urdu fonts, which were thoroughly refined to ensure consistent rendering schemes. It overcomes the scarcity of Arabic words, numerals, and Urdu digits by incorporating a significant number of samples representing these elements. Additionally, the dataset is enriched by randomly selecting words from a vocabulary of 100,000 words during the text generation process. As a result, UTRSet-Synth contains a total of 28,187 unique words, with an average word length of 7 characters.
The availability of the UTRSet-Synth dataset, a synthetic dataset that closely emulates real-world variations, addresses the scarcity of comprehensive real-world printed Urdu OCR datasets. By providing researchers with a valuable resource for developing and benchmarking Urdu OCR models, this dataset promotes standardized evaluation, and reproducibility, and fosters advancements in the field of Urdu OCR. For more information and details about the [UTRSet-Real](https://paperswithcode.com/dataset/utrset-real) & [UTRSet-Synth](https://paperswithcode.com/dataset/utrset-synth) datasets, please refer to the paper ["UTRNet: High-Resolution Urdu Text Recognition In Printed Documents"](https://arxiv.org/abs/2306.15782) |
VLyb/YAGO3-10 | ---
license: unlicense
---
|
OGB/ogbg-code2 | ---
license: mit
task_categories:
- graph-ml
---
# Dataset Card for ogbg-code2
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [External Use](#external-use)
- [PyGeometric](#pygeometric)
- [Dataset Structure](#dataset-structure)
- [Data Properties](#data-properties)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **[Homepage](https://ogb.stanford.edu/docs/graphprop/#ogbg-code2)**
- **[Repository](https://github.com/snap-stanford/ogb)**
- **Paper:** Open Graph Benchmark: Datasets for Machine Learning on Graphs (see citation)
- **Leaderboard:** [OGB leaderboard](https://ogb.stanford.edu/docs/leader_graphprop/#ogbg-code2) and [Papers with code leaderboard](https://paperswithcode.com/sota/graph-property-prediction-on-ogbg-code2)
### Dataset Summary
The `ogbg-code2` dataset contains Abstract Syntax Trees (ASTs) obtained from around 450 thousand Python method definitions from GitHub CodeSearchNet. "Methods are extracted from a total of 13,587 different repositories across the most popular projects on GitHub." The dataset was built by teams at Stanford to be a part of the Open Graph Benchmark. See their website or paper for details on the dataset postprocessing.
### Supported Tasks and Leaderboards
"The task is to predict the sub-tokens forming the method name, given the Python method body represented by AST and its node features. This task is often referred to as “code summarization”, because the model is trained to find succinct and precise description for a complete logical unit."
The score is the F1 score of sub-token prediction.
## External Use
### PyGeometric
To load in PyGeometric, do the following:
```python
from datasets import load_dataset
from torch_geometric.data import Data
from torch_geometric.loader import DataLoader
graphs_dataset = load_dataset("graphs-datasets/ogbg-code2")

# For the train set (replace by valid or test as needed);
# list-valued fields may need conversion to torch tensors first
graphs_list = [Data(**graph) for graph in graphs_dataset["train"]]
graphs_pygeometric = DataLoader(graphs_list)
```
## Dataset Structure
### Data Properties
| property | value |
|---|---|
| scale | medium |
| #graphs | 452,741 |
| average #nodes | 125.2 |
| average #edges | 124.2 |
| average node degree | 2.0 |
| average cluster coefficient | 0.0 |
| MaxSCC ratio | 1.000 |
| graph diameter | 13.5 |
### Data Fields
Each row of a given file is a graph, with:
- `edge_index` (list: 2 x #edges): pairs of nodes constituting edges
- `edge_feat` (list: #edges x #edge-features): features of edges
- `node_feat` (list: #nodes x #node-features): the nodes features, embedded
- `node_feat_expanded` (list: #nodes x #node-features): the nodes features, as code
- `node_is_attributed` (list: 1 x #nodes): ?
- `node_dfs_order` (list: #nodes x #1): the nodes order in the abstract tree, if parsed using a depth first search
- `node_depth` (list: #nodes x #1): the nodes depth in the abstract tree
- `y` (list: 1 x #tokens): contains the tokens to predict as method name
- `num_nodes` (int): number of nodes of the graph
- `ptr` (list: 2): index of first and last node of the graph
- `batch` (list: 1 x #nodes): ?
### Data Splits
This data comes from the PyGeometric version of the dataset provided by OGB, and follows the provided data splits.
This information can be found back using
```python
from ogb.graphproppred import PygGraphPropPredDataset
dataset = PygGraphPropPredDataset(name = 'ogbg-code2')
split_idx = dataset.get_idx_split()
train = dataset[split_idx['train']] # valid, test
```
More information (`node_feat_expanded`) has been added through the typeidx2type and attridx2attr csv files of the repo.
## Additional Information
### Licensing Information
The dataset has been released under the MIT license.
### Citation Information
```
@inproceedings{hu-etal-2020-open,
author = {Weihua Hu and
Matthias Fey and
Marinka Zitnik and
Yuxiao Dong and
Hongyu Ren and
Bowen Liu and
Michele Catasta and
Jure Leskovec},
editor = {Hugo Larochelle and
Marc Aurelio Ranzato and
Raia Hadsell and
Maria{-}Florina Balcan and
Hsuan{-}Tien Lin},
title = {Open Graph Benchmark: Datasets for Machine Learning on Graphs},
booktitle = {Advances in Neural Information Processing Systems 33: Annual Conference
on Neural Information Processing Systems 2020, NeurIPS 2020, December
6-12, 2020, virtual},
year = {2020},
url = {https://proceedings.neurips.cc/paper/2020/hash/fb60d411a5c5b72b2e7d3527cfc84fd0-Abstract.html},
}
```
### Contributions
Thanks to [@clefourrier](https://github.com/clefourrier) for adding this dataset. |
irds/mmarco_v2_de | ---
pretty_name: '`mmarco/v2/de`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `mmarco/v2/de`
The `mmarco/v2/de` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/mmarco#mmarco/v2/de).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=8,841,823
This dataset is used by: [`mmarco_v2_de_dev`](https://huggingface.co/datasets/irds/mmarco_v2_de_dev), [`mmarco_v2_de_train`](https://huggingface.co/datasets/irds/mmarco_v2_de_train)
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/mmarco_v2_de', 'docs')
for record in docs:
record # {'doc_id': ..., 'text': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Bonifacio2021MMarco,
title={{mMARCO}: A Multilingual Version of {MS MARCO} Passage Ranking Dataset},
author={Luiz Henrique Bonifacio and Israel Campiotti and Roberto Lotufo and Rodrigo Nogueira},
year={2021},
journal={arXiv:2108.13897}
}
```
|
JohnTeddy3/text2image-multi-prompt | ---
license: apache-2.0
language:
- en
multilinguality:
- monolingual
pretty_name: multi text2image prompts a dataset collection
source_datasets:
- bartman081523/stable-diffusion-discord-prompts
- succinctly/midjourney-prompts
- Gustavosta/Stable-Diffusion-Prompts
tags:
- text generation
---
### Reposted from pszemraj/text2image-multi-prompt
# text2image multi-prompt(s): a dataset collection
- collection of several text2image prompt datasets
- data was cleaned/normalized with the goal of removing "model specific APIs" like the "--ar" for Midjourney and so on
- data de-duplicated at a basic level: exact duplicate prompts were dropped (_after cleaning and normalization_)
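The cleaning and de-duplication described above can be sketched roughly as follows. This is a simplified illustration, not the exact pipeline; the `--ar` and `--v` flags are examples of the model-specific API markers mentioned above, and the regex is an assumption about how they might be stripped:

```python
import re

def clean_prompt(text: str) -> str:
    """Strip model-specific flags (e.g. Midjourney's "--ar 16:9") and normalize whitespace."""
    text = re.sub(r"--\w+(\s+\S+)?", "", text)  # drop flags like "--ar 16:9" or "--v 4"
    return " ".join(text.split()).strip()

prompts = [
    "a castle on a hill --ar 16:9",
    "a castle on a hill",
    "portrait of a cat, oil painting --v 4",
]

# Exact de-duplication after cleaning/normalization
seen, deduped = set(), []
for p in prompts:
    c = clean_prompt(p)
    if c not in seen:
        seen.add(c)
        deduped.append(c)

print(deduped)  # ['a castle on a hill', 'portrait of a cat, oil painting']
```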
## contents
```
DatasetDict({
train: Dataset({
features: ['text', 'src_dataset'],
num_rows: 3551734
})
test: Dataset({
features: ['text', 'src_dataset'],
num_rows: 399393
})
})
```
_NOTE: as the other two datasets did not have a `validation` split, the validation split of `succinctly/midjourney-prompts` was merged into `train`._ |
eduagarcia/CrawlPT_dedup | ---
language:
- pt
size_categories:
- 10M<n<100M
task_categories:
- text-generation
pretty_name: CrawlPT (deduplicated)
dataset_info:
- config_name: OSCAR-2301
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: meta
struct:
- name: categories
sequence: string
- name: dedup
struct:
- name: exact_norm
struct:
- name: cluster_main_idx
dtype: int64
- name: cluster_size
dtype: int64
- name: exact_hash_idx
dtype: int64
- name: is_duplicate
dtype: bool
- name: minhash
struct:
- name: cluster_main_idx
dtype: int64
- name: cluster_size
dtype: int64
- name: is_duplicate
dtype: bool
- name: minhash_idx
dtype: int64
- name: harmful_pp
dtype: float64
- name: identification
struct:
- name: label
dtype: string
- name: prob
dtype: float64
- name: quality_warnings
sequence: string
- name: sentence_identifications
list:
- name: label
dtype: string
- name: prob
dtype: float64
- name: tlsh
dtype: string
- name: warc_headers
struct:
- name: content-length
dtype: int64
- name: content-type
dtype: string
- name: warc-block-digest
dtype: string
- name: warc-date
dtype: string
- name: warc-identified-content-language
dtype: string
- name: warc-record-id
dtype: string
- name: warc-refers-to
dtype: string
- name: warc-target-uri
dtype: string
- name: warc-type
dtype: string
splits:
- name: train
num_bytes: 77259995670.30853
num_examples: 10888966
download_size: 42589347661
dataset_size: 77259995670.30853
- config_name: all
features:
- name: id
dtype: int64
- name: source
dtype: string
- name: orig_id
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 133074727589
num_examples: 52462533
download_size: 81483949567
dataset_size: 133074727589
- config_name: brwac
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: meta
struct:
- name: dedup
struct:
- name: exact_norm
struct:
- name: cluster_main_idx
dtype: int64
- name: cluster_size
dtype: int64
- name: exact_hash_idx
dtype: int64
- name: is_duplicate
dtype: bool
- name: minhash
struct:
- name: cluster_main_idx
dtype: int64
- name: cluster_size
dtype: int64
- name: is_duplicate
dtype: bool
- name: minhash_idx
dtype: int64
- name: doc_id
dtype: string
- name: title
dtype: string
- name: uri
dtype: string
splits:
- name: train
num_bytes: 18218935459.169613
num_examples: 3513588
download_size: 11210909325
dataset_size: 18218935459.169613
- config_name: cc100
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: meta
struct:
- name: dedup
struct:
- name: exact_norm
struct:
- name: cluster_main_idx
dtype: int64
- name: cluster_size
dtype: int64
- name: exact_hash_idx
dtype: int64
- name: is_duplicate
dtype: bool
- name: minhash
struct:
- name: cluster_main_idx
dtype: int64
- name: cluster_size
dtype: int64
- name: is_duplicate
dtype: bool
- name: minhash_idx
dtype: int64
splits:
- name: train
num_bytes: 53707749127.11777
num_examples: 38059979
download_size: 34844109320
dataset_size: 53707749127.11777
configs:
- config_name: OSCAR-2301
data_files:
- split: train
path: OSCAR-2301/train-*
- config_name: all
data_files:
- split: train
path: all/train-*
- config_name: brwac
data_files:
- split: train
path: brwac/train-*
- config_name: cc100
data_files:
- split: train
path: cc100/train-*
---
# CrawlPT (deduplicated)
CrawlPT is a generic Portuguese corpus extracted from various web pages.
This version is deduplicated using the MinHash algorithm and Locality Sensitive Hashing, following the approach of Lee et al. (2022).
The raw version is also available [here](https://huggingface.co/datasets/eduagarcia/CrawlPT).
## Dataset Details
Dataset is composed by three corpora:
[brWaC](https://aclanthology.org/L18-1686/), [C100-PT](https://arxiv.org/abs/1911.02116), [OSCAR-2301](http://arxiv.org/abs/2201.06642).
- **brWaC**: a web corpus for Brazilian Portuguese from 120,000 different websites.
- **C100-PT**: Portuguese subset from CC-100. CC-100 was created for training the multilingual Transformer XLM-R, containing two terabytes of cleaned data from 2018 snapshots of the [Common Crawl project](https://commoncrawl.org/about/) in 100 languages. We use the Portuguese subset, which contains 49.1 GiB of text.
- **OSCAR-2301-PT**: curation from OSCAR-2301 in the Portuguese language.
### Dataset Description
- **Language(s) (NLP):** Brazilian Portuguese (pt-BR)
- **Repository:** https://github.com/eduagarcia/roberta-legal-portuguese
- **Paper:** https://aclanthology.org/2024.propor-1.38/
## Data Collection and Processing
Raw corpora sizes in terms of billions of tokens and file size in GiB:
| Corpus | Domain | Tokens (B) | Size (GiB) |
|-----------------|:-------:|:----------:|:----------:|
| brWaC | General | 2.7 | 16.3 |
| CC100 (PT) | General | 8.4 | 49.1 |
| OSCAR-2301 (PT) | General | 18.1 | 97.8 |
CrawlPT is deduplicated using [MinHash algorithm](https://dl.acm.org/doi/abs/10.5555/647819.736184) and [Locality Sensitive Hashing](https://dspace.mit.edu/bitstream/handle/1721.1/134231/v008a014.pdf?sequence=2&isAllowed=y), following the approach of [Lee et al. (2022)](http://arxiv.org/abs/2107.06499).
We used 5-grams and a signature of size 256, considering two documents to be identical if their Jaccard Similarity exceeded 0.7.
Deduplicate rate found by the Minhash-LSH algorithm for the CrawlPT corpus:
| Corpus | Documents | Docs. after deduplication | Duplicates (%) |
|------------------------|:----------:|:-------------------------:|:--------------:|
| brWaC | 3,530,796 | 3,513,588 | 0.49 |
| OSCAR-2301 (PT Subset) | 18,031,400 | 10,888,966 | 39.61 |
| CC100 (PT Subset) | 38,999,388 | 38,059,979 | 2.41 |
| Total (CrawlPT) | 60,561,584 | 52,462,533 | 13.37 |
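The duplicate criterion (Jaccard similarity of 5-gram sets above 0.7) can be sketched as follows. This computes the exact Jaccard similarity that MinHash approximates; the use of word-level 5-grams and the example texts are assumptions for illustration:

```python
def shingles(text: str, n: int = 5) -> set:
    """Word-level n-gram shingles of a document (at least one shingle for short texts)."""
    words = text.split()
    return {" ".join(words[i:i + n]) for i in range(max(1, len(words) - n + 1))}

def jaccard(a: set, b: set) -> float:
    """Jaccard similarity |A ∩ B| / |A ∪ B| of two shingle sets."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

doc1 = "o gato subiu no telhado da casa amarela perto do rio"
doc2 = "o gato subiu no telhado da casa amarela perto do mar"  # near-duplicate
doc3 = "receita de bolo de cenoura com cobertura de chocolate simples"

sim_12 = jaccard(shingles(doc1), shingles(doc2))
sim_13 = jaccard(shingles(doc1), shingles(doc3))

# Pairs with similarity above 0.7 are treated as duplicates
print(sim_12 > 0.7, sim_13 > 0.7)  # True False
```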
## Citation
```bibtex
@inproceedings{garcia-etal-2024-robertalexpt,
title = "{R}o{BERT}a{L}ex{PT}: A Legal {R}o{BERT}a Model pretrained with deduplication for {P}ortuguese",
author = "Garcia, Eduardo A. S. and
Silva, Nadia F. F. and
Siqueira, Felipe and
Albuquerque, Hidelberg O. and
Gomes, Juliana R. S. and
Souza, Ellen and
Lima, Eliomar A.",
editor = "Gamallo, Pablo and
Claro, Daniela and
Teixeira, Ant{\'o}nio and
Real, Livy and
Garcia, Marcos and
Oliveira, Hugo Gon{\c{c}}alo and
Amaro, Raquel",
booktitle = "Proceedings of the 16th International Conference on Computational Processing of Portuguese",
month = mar,
year = "2024",
address = "Santiago de Compostela, Galicia/Spain",
publisher = "Association for Computational Lingustics",
url = "https://aclanthology.org/2024.propor-1.38",
pages = "374--383",
}
```
## Acknowledgment
This work has been supported by the AI Center of Excellence (Centro de Excelência em Inteligência Artificial – CEIA) of the Institute of Informatics at the Federal University of Goiás (INF-UFG). |
rajendrabaskota/gan-test-dataset | ---
dataset_info:
features:
- name: file_path
dtype: string
- name: label
dtype: int64
- name: img_embed
sequence: float64
splits:
- name: test
num_bytes: 542692009
num_examples: 87687
download_size: 441523966
dataset_size: 542692009
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
zolak/twitter_dataset_50_1713034044 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 6800795
num_examples: 16689
download_size: 3402685
dataset_size: 6800795
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
wbxlala/Dreamer_Arousal_shuffled | ---
dataset_info:
features:
- name: image
sequence:
sequence:
sequence: float64
- name: label
dtype: float64
splits:
- name: train
num_bytes: 499671504.0
num_examples: 414
download_size: 492836658
dataset_size: 499671504.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AdapterOcean/med_alpaca_standardized_cluster_62_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 14252711
num_examples: 8347
download_size: 7268998
dataset_size: 14252711
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_62_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
RaviNaik/C4-Kn | ---
dataset_info:
features:
- name: text
dtype: string
- name: timestamp
dtype: timestamp[s]
- name: url
dtype: string
splits:
- name: train
num_bytes: 7772502793
num_examples: 1056849
- name: validation
num_bytes: 7579027
num_examples: 1039
download_size: 3033462453
dataset_size: 7780081820
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
This is a filtered version of the [C4](https://huggingface.co/datasets/allenai/c4) dataset only containing samples of Kannada language.
The dataset contains total of 1056849 training and 1039 validation samples.
### Data Sample:
```python
{'text': 'ಹಳ್ಳಿಯ ‘ಬೋಲ್ಟ್\u200c’ಗಳನ್ನು ಗುರುತಿಸಿ | Prajavani\nಪ್ರಜಾವಾಣಿ ವಾರ್ತೆ Updated: 18 ಫೆಬ್ರವರಿ 2020, 01:30 IST\nಉಡುಪಿಯ ಐಕಳದಲ್ಲಿ ಇತ್ತೀಚೆಗೆ ನಡೆದ ಕಂಬಳದ ಓಟದಲ್ಲಿ ಶ್ರೀನಿವಾಸ ಗೌಡ ಎಂಬುವರು ವಿಶ್ವದ ವೇಗದ ಓಟಗಾರ ಉಸೇನ್ ಬೋಲ್ಟ್ ಅವರಿಗಿಂತಲೂ ವೇಗವಾಗಿ ಓಡಿ ಗುರಿ ತಲುಪಿದ್ದು, ಸಾರ್ವಜನಿಕರ ಮೆಚ್ಚುಗೆಗೆ ಪಾತ್ರರಾಗಿದ್ದಾರೆ. ಗ್ರಾಮೀಣ ಪ್ರದೇಶ\nಗಳಲ್ಲಿ ಇರುವ ಇಂತಹ ಓಟಗಾರರು ಮತ್ತು ಆಟಗಾರರು ಎಲೆಮರೆಯ ಕಾಯಿಯಂತೆ ತಮ್ಮ ಪಾಡಿಗೆ ತಾವು ಬೆಳೆಯುತ್ತಿರುತ್ತಾರೆ. ಶಾಲಾ- ಕಾಲೇಜುಗಳಲ್ಲಿ ಓದುತ್ತಿರುವವರಿಗೆ ಮುಂದೆ ಬರಲು ಸ್ವಲ್ಪಮಟ್ಟಿಗಾದರೂ ಅವಕಾಶ ಇರುತ್ತದೆ. ಅವಿದ್ಯಾವಂತರಿಗೆ ಅದೂ ಇಲ್ಲ.\nಇನ್ನು ಕ್ರೀಡಾಕೂಟಗಳಿಗೆ.......',
'timestamp': datetime.datetime(2020, 4, 1, 16, 50, 10),
'url': 'https://www.prajavani.net/op-ed/readers-letter/need-more-publicity-to-kambala-sports-706114.html'}
```
### Use with Datasets:
```python
from datasets import load_dataset
ds = load_dataset("RaviNaik/C4-Kn")
```
|
huggingface/autotrain-data-autotrain-g8rnq-78gb0-1 | Invalid username or password. |
open-llm-leaderboard/details_yeontaek__Platypus2-13B-QLoRa | ---
pretty_name: Evaluation run of yeontaek/Platypus2-13B-QLoRa
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yeontaek/Platypus2-13B-QLoRa](https://huggingface.co/yeontaek/Platypus2-13B-QLoRa)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__Platypus2-13B-QLoRa\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-22T02:07:21.128388](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__Platypus2-13B-QLoRa/blob/main/results_2023-10-22T02-07-21.128388.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.007445469798657718,\n\
\ \"em_stderr\": 0.0008803652515899919,\n \"f1\": 0.06792785234899322,\n\
\ \"f1_stderr\": 0.001576095719649218,\n \"acc\": 0.4082075883226931,\n\
\ \"acc_stderr\": 0.008948818415880626\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.007445469798657718,\n \"em_stderr\": 0.0008803652515899919,\n\
\ \"f1\": 0.06792785234899322,\n \"f1_stderr\": 0.001576095719649218\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.050037907505686124,\n \
\ \"acc_stderr\": 0.006005442354577729\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7663772691397001,\n \"acc_stderr\": 0.011892194477183524\n\
\ }\n}\n```"
repo_url: https://huggingface.co/yeontaek/Platypus2-13B-QLoRa
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|arc:challenge|25_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_22T02_07_21.128388
path:
- '**/details_harness|drop|3_2023-10-22T02-07-21.128388.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-22T02-07-21.128388.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_22T02_07_21.128388
path:
- '**/details_harness|gsm8k|5_2023-10-22T02-07-21.128388.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-22T02-07-21.128388.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hellaswag|10_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T03:06:05.909035.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T03:06:05.909035.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T03:06:05.909035.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_22T02_07_21.128388
path:
- '**/details_harness|winogrande|5_2023-10-22T02-07-21.128388.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-22T02-07-21.128388.parquet'
- config_name: results
data_files:
- split: 2023_08_18T03_06_05.909035
path:
- results_2023-08-18T03:06:05.909035.parquet
- split: 2023_10_22T02_07_21.128388
path:
- results_2023-10-22T02-07-21.128388.parquet
- split: latest
path:
- results_2023-10-22T02-07-21.128388.parquet
---
# Dataset Card for Evaluation run of yeontaek/Platypus2-13B-QLoRa
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yeontaek/Platypus2-13B-QLoRa
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yeontaek/Platypus2-13B-QLoRa](https://huggingface.co/yeontaek/Platypus2-13B-QLoRa) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yeontaek__Platypus2-13B-QLoRa",
"harness_winogrande_5",
split="train")
```
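Each per-task config name in the listing above is a mechanical rewrite of the corresponding harness task string, with `|`, `:` and `-` replaced by underscores. A small illustrative helper for building the config name to pass to `load_dataset` — an assumption based on the file listing in this card, not an official API:

```python
# Sketch: derive a config name (e.g. "harness_hendrycksTest_abstract_algebra_5")
# from a harness task string (e.g. "harness|hendrycksTest-abstract_algebra|5").
# This mirrors the naming visible in the YAML above; it is an illustration only.
def config_name(task: str) -> str:
    return task.replace("|", "_").replace(":", "_").replace("-", "_")

print(config_name("harness|truthfulqa:mc|0"))  # harness_truthfulqa_mc_0
```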
## Latest results
These are the [latest results from run 2023-10-22T02:07:21.128388](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__Platypus2-13B-QLoRa/blob/main/results_2023-10-22T02-07-21.128388.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" config and the "latest" split for each eval):
```python
{
"all": {
"em": 0.007445469798657718,
"em_stderr": 0.0008803652515899919,
"f1": 0.06792785234899322,
"f1_stderr": 0.001576095719649218,
"acc": 0.4082075883226931,
"acc_stderr": 0.008948818415880626
},
"harness|drop|3": {
"em": 0.007445469798657718,
"em_stderr": 0.0008803652515899919,
"f1": 0.06792785234899322,
"f1_stderr": 0.001576095719649218
},
"harness|gsm8k|5": {
"acc": 0.050037907505686124,
"acc_stderr": 0.006005442354577729
},
"harness|winogrande|5": {
"acc": 0.7663772691397001,
"acc_stderr": 0.011892194477183524
}
}
```
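As a sanity check on the numbers above, the aggregate `acc` in the `all` block appears to be the unweighted mean of the per-task accuracies (gsm8k and winogrande here); a minimal sketch:

```python
# Recompute the aggregate accuracy from the per-task results shown above.
# Assumes (based on the numbers) that "all.acc" is an unweighted mean.
per_task_acc = {
    "harness|gsm8k|5": 0.050037907505686124,
    "harness|winogrande|5": 0.7663772691397001,
}
mean_acc = sum(per_task_acc.values()) / len(per_task_acc)
print(round(mean_acc, 6))  # -> 0.408208, matching the "all" block's acc
```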
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Jumtra/for_finetune_mpt7b_v6 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 190709417.20501885
num_examples: 153700
- name: test
num_bytes: 10037990.794981148
num_examples: 8090
download_size: 103293572
dataset_size: 200747408.0
---
# Dataset Card for "for_finetune_mpt7b_v6"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
indiejoseph/c4-cantonese-filtered | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2304453
num_examples: 21558
download_size: 1820474
dataset_size: 2304453
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "c4-cantonese-filtered"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lexaizero/AingMaungArrghhhAingSiaMAungEtahSaha | ---
license: mit
---
|
Devio/books4_epub | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 17551774022.22586
num_examples: 1540371
download_size: 12937475460
dataset_size: 17551774022.22586
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
fathyshalab/massive_qa | ---
dataset_info:
features:
- name: id
dtype: string
- name: label
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 64967
num_examples: 1183
- name: validation
num_bytes: 11778
num_examples: 214
- name: test
num_bytes: 15940
num_examples: 288
download_size: 54118
dataset_size: 92685
---
# Dataset Card for "massive_qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/reiuji_utsuho_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of reiuji_utsuho/霊烏路空/레이우지우츠호 (Touhou)
This is the dataset of reiuji_utsuho/霊烏路空/레이우지우츠호 (Touhou), containing 500 images and their tags.
The core tags of this character are `long_hair, bow, hair_bow, green_bow, wings, red_eyes, third_eye, black_hair, black_wings, brown_hair, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 680.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/reiuji_utsuho_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 425.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/reiuji_utsuho_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1119 | 784.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/reiuji_utsuho_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 621.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/reiuji_utsuho_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1119 | 1.01 GiB | [Download](https://huggingface.co/datasets/CyberHarem/reiuji_utsuho_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/reiuji_utsuho_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
These are the tag clustering results; distinct outfits for this character may be mined from the clusters below.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, arm_cannon, cape, green_skirt, open_mouth, short_sleeves, solo, shirt, smile, blush |
| 1 | 6 |  |  |  |  |  | 1girl, arm_cannon, cape, green_skirt, solo, grin |
| 2 | 16 |  |  |  |  |  | 1girl, arm_cannon, black_thighhighs, cape, green_skirt, solo, smile, zettai_ryouiki |
| 3 | 6 |  |  |  |  |  | 1girl, arm_cannon, cape, green_skirt, mismatched_footwear, smile, solo, sun |
| 4 | 5 |  |  |  |  |  | 1girl, arm_cannon, black_thighhighs, cape, green_skirt, looking_at_viewer, shirt, solo, zettai_ryouiki, puffy_short_sleeves, bird_wings, open_mouth, white_background |
| 5 | 8 |  |  |  |  |  | 1girl, arm_cannon, bangs, bird_wings, closed_mouth, collared_shirt, green_skirt, looking_at_viewer, solo, starry_sky_print, white_cape, white_shirt, black_socks, frilled_skirt, full_body, kneehighs, mismatched_footwear, puffy_short_sleeves, frilled_shirt_collar, smile, feathered_wings, simple_background, single_shoe, white_background, buttons, hair_between_eyes, black_footwear, brown_footwear, very_long_hair |
| 6 | 5 |  |  |  |  |  | 1girl, arm_cannon, bird_wings, black_socks, collared_shirt, feathered_wings, frilled_shirt_collar, frilled_skirt, green_skirt, looking_at_viewer, puffy_short_sleeves, solo, starry_sky_print, white_cape, white_shirt, bangs, kneehighs, blouse, open_mouth, shoes, feet_out_of_frame, foot_out_of_frame, medium_breasts, very_long_hair |
| 7 | 6 |  |  |  |  |  | 1girl, arm_cannon, bangs, bird_wings, collared_shirt, green_skirt, puffy_short_sleeves, solo, white_cape, white_shirt, closed_mouth, feathered_wings, hair_between_eyes, looking_at_viewer, smile, center_frills, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | arm_cannon | cape | green_skirt | open_mouth | short_sleeves | solo | shirt | smile | blush | grin | black_thighhighs | zettai_ryouiki | mismatched_footwear | sun | looking_at_viewer | puffy_short_sleeves | bird_wings | white_background | bangs | closed_mouth | collared_shirt | starry_sky_print | white_cape | white_shirt | black_socks | frilled_skirt | full_body | kneehighs | frilled_shirt_collar | feathered_wings | simple_background | single_shoe | buttons | hair_between_eyes | black_footwear | brown_footwear | very_long_hair | blouse | shoes | feet_out_of_frame | foot_out_of_frame | medium_breasts | center_frills | upper_body |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:-------|:--------------|:-------------|:----------------|:-------|:--------|:--------|:--------|:-------|:-------------------|:-----------------|:----------------------|:------|:--------------------|:----------------------|:-------------|:-------------------|:--------|:---------------|:-----------------|:-------------------|:-------------|:--------------|:--------------|:----------------|:------------|:------------|:-----------------------|:------------------|:--------------------|:--------------|:----------|:--------------------|:-----------------|:-----------------|:-----------------|:---------|:--------|:--------------------|:--------------------|:-----------------|:----------------|:-------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 16 |  |  |  |  |  | X | X | X | X | | | X | | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | X | X | X | | | X | | X | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | X | X | X | | X | X | | | | X | X | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | X | X | | X | | | X | | X | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | X | | X | X | | X | | | | | | | | | X | X | X | | X | | X | X | X | X | X | X | | X | X | X | | | | | | | X | X | X | X | X | X | | |
| 7 | 6 |  |  |  |  |  | X | X | | X | | | X | | X | | | | | | | X | X | X | | X | X | X | | X | X | | | | | | X | | | | X | | | | | | | | | X | X |
|
Thi4gomn/Voz_do_rascal2 | ---
license: openrail
---
|
quipohealth/d | ---
license: mit
---
|
open-llm-leaderboard/details_TheBloke__tulu-30B-fp16 | ---
pretty_name: Evaluation run of TheBloke/tulu-30B-fp16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/tulu-30B-fp16](https://huggingface.co/TheBloke/tulu-30B-fp16) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__tulu-30B-fp16\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-22T14:05:44.356727](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__tulu-30B-fp16/blob/main/results_2023-10-22T14-05-44.356727.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.4158976510067114,\n\
\ \"em_stderr\": 0.005047512015363023,\n \"f1\": 0.4501331795302018,\n\
\ \"f1_stderr\": 0.004938014903871411,\n \"acc\": 0.5026636978936352,\n\
\ \"acc_stderr\": 0.011011615647480079\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.4158976510067114,\n \"em_stderr\": 0.005047512015363023,\n\
\ \"f1\": 0.4501331795302018,\n \"f1_stderr\": 0.004938014903871411\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.19711902956785443,\n \
\ \"acc_stderr\": 0.01095802163030063\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8082083662194159,\n \"acc_stderr\": 0.011065209664659527\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TheBloke/tulu-30B-fp16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_10_22T14_05_44.356727
path:
- '**/details_harness|drop|3_2023-10-22T14-05-44.356727.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-22T14-05-44.356727.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_22T14_05_44.356727
path:
- '**/details_harness|gsm8k|5_2023-10-22T14-05-44.356727.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-22T14-05-44.356727.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_22T14_05_44.356727
path:
- '**/details_harness|winogrande|5_2023-10-22T14-05-44.356727.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-22T14-05-44.356727.parquet'
- config_name: results
data_files:
- split: 2023_10_22T14_05_44.356727
path:
- results_2023-10-22T14-05-44.356727.parquet
- split: latest
path:
- results_2023-10-22T14-05-44.356727.parquet
---
# Dataset Card for Evaluation run of TheBloke/tulu-30B-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/tulu-30B-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/tulu-30B-fp16](https://huggingface.co/TheBloke/tulu-30B-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__tulu-30B-fp16",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-22T14:05:44.356727](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__tulu-30B-fp16/blob/main/results_2023-10-22T14-05-44.356727.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task's results in its "latest" split):
```python
{
"all": {
"em": 0.4158976510067114,
"em_stderr": 0.005047512015363023,
"f1": 0.4501331795302018,
"f1_stderr": 0.004938014903871411,
"acc": 0.5026636978936352,
"acc_stderr": 0.011011615647480079
},
"harness|drop|3": {
"em": 0.4158976510067114,
"em_stderr": 0.005047512015363023,
"f1": 0.4501331795302018,
"f1_stderr": 0.004938014903871411
},
"harness|gsm8k|5": {
"acc": 0.19711902956785443,
"acc_stderr": 0.01095802163030063
},
"harness|winogrande|5": {
"acc": 0.8082083662194159,
"acc_stderr": 0.011065209664659527
}
}
```
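As a sanity check on the results above, the aggregate `acc` in the `all` block is simply the arithmetic mean of the per-task accuracies (the drop task reports `em`/`f1` rather than `acc`, so it is excluded). A minimal verification in Python, using the numbers from the JSON above:

```python
# Per-task metrics copied from the latest-results JSON above.
results = {
    "harness|drop|3": {"em": 0.4158976510067114, "f1": 0.4501331795302018},
    "harness|gsm8k|5": {"acc": 0.19711902956785443},
    "harness|winogrande|5": {"acc": 0.8082083662194159},
}

# Average only the tasks that report an accuracy.
accs = [scores["acc"] for scores in results.values() if "acc" in scores]
mean_acc = sum(accs) / len(accs)
print(mean_acc)  # 0.5026636978936352, matching the "all" block above
```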
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
dotan1111/MSA-nuc-10-seq | ---
tags:
- sequence-to-sequence
- bioinformatics
- biology
---
# Multiple Sequence Alignment as a Sequence-to-Sequence Learning Problem
## Abstract:
The sequence alignment problem is one of the most fundamental problems in bioinformatics, and a plethora of methods have been devised to tackle it. Here we introduce BetaAlign, a methodology for aligning sequences using an NLP approach. BetaAlign accounts for the possible variability of the evolutionary process among different datasets by using an ensemble of transformers, each trained on millions of samples generated from a different evolutionary model. Our approach leads to alignment accuracy that is similar to, and often better than, that of commonly used methods, such as MAFFT, DIALIGN, ClustalW, T-Coffee, PRANK, and MUSCLE.

An illustration of aligning sequences with sequence-to-sequence learning. (a) Consider two input sequences "AAG" and "ACGG". (b) The result of encoding the unaligned sequences into the source language (*Concat* representation). (c) The sentence from the source language is translated to the target language via a transformer model. (d) The translated sentence in the target language (*Spaces* representation). (e) The resulting alignment, decoded from the translated sentence, in which "AA-G" is aligned to "ACGG". The transformer architecture illustration is adapted from (Vaswani et al., 2017).
## Data:
We used SpartaABC (Loewenthal et al., 2021) to generate millions of true alignments. SpartaABC requires the following input: (1) a rooted phylogenetic tree, which includes a topology and branch lengths; (2) a substitution model (amino acids or nucleotides); (3) root sequence length; (4) the indel model parameters, which include: insertion rate (*R_I*), deletion rate (*R_D*), a parameter for the insertion Zipfian distribution (*A_I*), and a parameter for the deletion Zipfian distribution (*A_D*). MSAs were simulated along random phylogenetic tree topologies generated using the program ETE version 3.0 (Huerta-Cepas et al., 2016) with default parameters.
We generated 1,495,000, 2,000, and 3,000 protein MSAs with ten sequences each, which were used as training, validation, and testing data, respectively. We generated the same number of DNA MSAs. For each random tree, branch lengths were drawn from a uniform distribution in the range *(0.5,1.0)*. Next, the sequences were generated using SpartaABC with the following parameters: *R_I,R_D \in (0.0,0.05)*, *A_I, A_D \in (1.01,2.0)*. The alignment lengths as well as the sequence lengths of the tree leaves vary within and among datasets, as they depend on the indel dynamics and the root length. The root length was sampled uniformly in the range *[32,44]*. Unless stated otherwise, all protein datasets were generated with the WAG+G model, and all DNA datasets were generated with the GTR+G model, with the following parameters: (1) frequencies for the different nucleotides *(0.37, 0.166, 0.307, 0.158)*, in the order "T", "C", "A" and "G"; (2) substitution rates *(0.444, 0.0843, 0.116, 0.107, 0.00027)*, in the order "a", "b", "c", "d", and "e" for the substitution matrix.
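The parameters *A_I* and *A_D* above control Zipfian distributions over insertion and deletion lengths. As an illustrative sketch only (this is not SpartaABC's actual implementation, and the truncation cap of 50 is an assumption for the example), an indel length could be sampled from a truncated Zipfian like this:

```python
import random

def sample_indel_length(a: float, max_len: int = 50) -> int:
    """Sample an indel length k from a truncated Zipfian: P(k) proportional to k**(-a)."""
    lengths = range(1, max_len + 1)
    weights = [k ** -a for k in lengths]
    return random.choices(lengths, weights=weights)[0]

# With a in (1.01, 2.0), as used for the dataset generation, short indels dominate.
print(sample_indel_length(1.5))
```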
## Example:
The following example corresponds to the MSA illustrated in the figure above:
{"MSA": "AAAC-GGG", "unaligned_seqs": {"seq0": "AAG", "seq1": "ACGG"}}
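The `MSA` string appears to interleave alignment columns (column-major over the sequences): splitting `"AAAC-GGG"` into columns of two characters gives `AA`, `AC`, `-G`, `GG`, whose rows are `"AA-G"` and `"ACGG"`, i.e. the alignment from the figure. A minimal decoder under that assumption:

```python
def decode_msa(msa: str, n_seqs: int) -> list[str]:
    # Split the column-major MSA string into columns of n_seqs characters,
    # then read off each row to recover the aligned sequences.
    cols = [msa[i:i + n_seqs] for i in range(0, len(msa), n_seqs)]
    return ["".join(col[r] for col in cols) for r in range(n_seqs)]

print(decode_msa("AAAC-GGG", 2))  # ['AA-G', 'ACGG']
```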
## APA
```
Dotan, E., Belinkov, Y., Avram, O., Wygoda, E., Ecker, N., Alburquerque, M., Keren, O., Loewenthal, G., & Pupko T. (2023). Multiple sequence alignment as a sequence-to-sequence learning problem. The Eleventh International Conference on Learning Representations (ICLR 2023).
```
## BibTeX
```
@inproceedings{Dotan_multiple_2023,
 author = {Dotan, Edo and Belinkov, Yonatan and Avram, Oren and Wygoda, Elya and Ecker, Noa and Alburquerque, Michael and Keren, Omri and Loewenthal, Gil and Pupko, Tal},
 booktitle = {{The Eleventh International Conference on Learning Representations (ICLR 2023)}},
 title = {{Multiple sequence alignment as a sequence-to-sequence learning problem}},
 year = {2023}
}
``` |
CyberHarem/grizzly_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of grizzly/グリズリー/灰熊MkⅤ (Girls' Frontline)
This is the dataset of grizzly/グリズリー/灰熊MkⅤ (Girls' Frontline), containing 273 images and their tags.
The core tags of this character are `purple_eyes, brown_hair, breasts, short_hair, large_breasts, sunglasses, eyewear_on_head, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 273 | 402.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/grizzly_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 273 | 211.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/grizzly_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 707 | 479.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/grizzly_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 273 | 347.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/grizzly_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 707 | 701.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/grizzly_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/grizzly_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, jacket, solo, handgun, holding_gun, white_background, belt, denim_shorts, white_shirt, aviator_sunglasses, black_gloves, looking_at_viewer, simple_background, single_thighhigh, black_thighhighs, full_body, holster, short_shorts, blue_shorts, brown_footwear, smile, standing |
| 1 | 11 |  |  |  |  |  | 1girl, jacket, solo, handgun, holding_gun, looking_at_viewer, aviator_sunglasses, smile, simple_background, white_shirt, black_gloves, upper_body, white_background, belt, shorts |
| 2 | 5 |  |  |  |  |  | 1girl, jacket, looking_at_viewer, navel, solo, gloves, smile, belt, blush, denim_shorts, short_shorts, aviator_sunglasses, black_bra, simple_background, white_background, white_shirt |
| 3 | 7 |  |  |  |  |  | 1girl, bear_ears, black_thighhighs, doughnut, jacket, official_alternate_costume, solo, aged_down, ahoge, garter_straps, looking_at_viewer, navel, belt, black_shirt, full_body, midriff, black_choker, black_footwear, blush, boots, crop_top, denim_shorts, short_shorts, simple_background, small_breasts, bear_girl, blue_shorts, collarbone, food_on_face, holding_food, short_sleeves, white_background |
| 4 | 15 |  |  |  |  |  | 1girl, official_alternate_costume, tank_top, choker, cleavage, navel, looking_at_viewer, solo, glasses, yellow_shorts, collarbone, red-framed_eyewear, short_shorts, belt, hair_ornament, thigh_strap, bag, burger, holding_food, midriff, short_ponytail, simple_background, smile, white_background, bracelet, full_body, medium_breasts, shoes, socks, watch |
| 5 | 8 |  |  |  |  |  | 1girl, blush, hetero, nipples, open_mouth, penis, sex, solo_focus, vaginal, 1boy, mosaic_censoring, spread_legs, sweat, cum_in_pussy, looking_at_viewer, smile, thighhighs, completely_nude, gloves, heart-shaped_pupils, lying, navel, straddling |
| 6 | 5 |  |  |  |  |  | 1girl, cleavage, fake_animal_ears, looking_at_viewer, playboy_bunny, rabbit_ears, solo, black_pantyhose, detached_collar, simple_background, black_leotard, bowtie, covered_navel, black_gloves, black_jacket, fur_trim, hand_on_hip, holding, open_jacket, smile, white_background, wrist_cuffs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | jacket | solo | handgun | holding_gun | white_background | belt | denim_shorts | white_shirt | aviator_sunglasses | black_gloves | looking_at_viewer | simple_background | single_thighhigh | black_thighhighs | full_body | holster | short_shorts | blue_shorts | brown_footwear | smile | standing | upper_body | shorts | navel | gloves | blush | black_bra | bear_ears | doughnut | official_alternate_costume | aged_down | ahoge | garter_straps | black_shirt | midriff | black_choker | black_footwear | boots | crop_top | small_breasts | bear_girl | collarbone | food_on_face | holding_food | short_sleeves | tank_top | choker | cleavage | glasses | yellow_shorts | red-framed_eyewear | hair_ornament | thigh_strap | bag | burger | short_ponytail | bracelet | medium_breasts | shoes | socks | watch | hetero | nipples | open_mouth | penis | sex | solo_focus | vaginal | 1boy | mosaic_censoring | spread_legs | sweat | cum_in_pussy | thighhighs | completely_nude | heart-shaped_pupils | lying | straddling | fake_animal_ears | playboy_bunny | rabbit_ears | black_pantyhose | detached_collar | black_leotard | bowtie | covered_navel | black_jacket | fur_trim | hand_on_hip | holding | open_jacket | wrist_cuffs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:-------|:----------|:--------------|:-------------------|:-------|:---------------|:--------------|:---------------------|:---------------|:--------------------|:--------------------|:-------------------|:-------------------|:------------|:----------|:---------------|:--------------|:-----------------|:--------|:-----------|:-------------|:---------|:--------|:---------|:--------|:------------|:------------|:-----------|:-----------------------------|:------------|:--------|:----------------|:--------------|:----------|:---------------|:-----------------|:--------|:-----------|:----------------|:------------|:-------------|:---------------|:---------------|:----------------|:-----------|:---------|:-----------|:----------|:----------------|:---------------------|:----------------|:--------------|:------|:---------|:-----------------|:-----------|:-----------------|:--------|:--------|:--------|:---------|:----------|:-------------|:--------|:------|:-------------|:----------|:-------|:-------------------|:--------------|:--------|:---------------|:-------------|:------------------|:----------------------|:--------|:-------------|:-------------------|:----------------|:--------------|:------------------|:------------------|:----------------|:---------|:----------------|:---------------|:-----------|:--------------|:----------|:--------------|:--------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | | X | X | X | X | X | | | | | | | | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | | | X | X | X | X | X | | X | X | | | | | X | | | X | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | X | X | | | X | X | X | | | | X | X | | X | X | | X | X | | | | | | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 15 |  |  |  |  |  | X | | X | | | X | X | | | | | X | X | | | X | | X | | | X | | | | X | | | | | | X | | | | | X | | | | | | | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | X | | | | | | | | | | | X | | | | | | | | | X | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | | X | | | X | | | | | X | X | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Gabriel1322/lucas | ---
license: openrail
---
|
anan-2024/twitter_dataset_1713127117 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 175783
num_examples: 481
download_size: 101123
dataset_size: 175783
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_Gille__StrangeMerges_20-7B-slerp | ---
pretty_name: Evaluation run of Gille/StrangeMerges_20-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Gille/StrangeMerges_20-7B-slerp](https://huggingface.co/Gille/StrangeMerges_20-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Gille__StrangeMerges_20-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-02T23:29:40.441351](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_20-7B-slerp/blob/main/results_2024-04-02T23-29-40.441351.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6568209906863018,\n\
\ \"acc_stderr\": 0.03197360214037413,\n \"acc_norm\": 0.655947230300773,\n\
\ \"acc_norm_stderr\": 0.03264361824443875,\n \"mc1\": 0.5679314565483476,\n\
\ \"mc1_stderr\": 0.01734120239498833,\n \"mc2\": 0.7090103408223626,\n\
\ \"mc2_stderr\": 0.014840595946247998\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7098976109215017,\n \"acc_stderr\": 0.013261573677520767,\n\
\ \"acc_norm\": 0.7312286689419796,\n \"acc_norm_stderr\": 0.012955065963710698\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7149970125473013,\n\
\ \"acc_stderr\": 0.004504932999736407,\n \"acc_norm\": 0.8844851623182632,\n\
\ \"acc_norm_stderr\": 0.003189889789404672\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.674074074074074,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.674074074074074,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7916666666666666,\n\
\ \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.7916666666666666,\n\
\ \"acc_norm_stderr\": 0.033961162058453336\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n\
\ \"acc_stderr\": 0.02289168798455496,\n \"acc_norm\": 0.7967741935483871,\n\
\ \"acc_norm_stderr\": 0.02289168798455496\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \
\ \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993464,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993464\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069363,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069363\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4346368715083799,\n\
\ \"acc_stderr\": 0.016578997435496713,\n \"acc_norm\": 0.4346368715083799,\n\
\ \"acc_norm_stderr\": 0.016578997435496713\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959607,\n\
\ \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959607\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47392438070404175,\n\
\ \"acc_stderr\": 0.012752858346533126,\n \"acc_norm\": 0.47392438070404175,\n\
\ \"acc_norm_stderr\": 0.012752858346533126\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462927,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462927\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806315,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806315\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578323,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578323\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5679314565483476,\n\
\ \"mc1_stderr\": 0.01734120239498833,\n \"mc2\": 0.7090103408223626,\n\
\ \"mc2_stderr\": 0.014840595946247998\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8342541436464088,\n \"acc_stderr\": 0.010450899545370625\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7217589082638363,\n \
\ \"acc_stderr\": 0.012343803671422677\n }\n}\n```"
repo_url: https://huggingface.co/Gille/StrangeMerges_20-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|arc:challenge|25_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|gsm8k|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hellaswag|10_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T23-29-40.441351.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T23-29-40.441351.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- '**/details_harness|winogrande|5_2024-04-02T23-29-40.441351.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-02T23-29-40.441351.parquet'
- config_name: results
data_files:
- split: 2024_04_02T23_29_40.441351
path:
- results_2024-04-02T23-29-40.441351.parquet
- split: latest
path:
- results_2024-04-02T23-29-40.441351.parquet
---
# Dataset Card for Evaluation run of Gille/StrangeMerges_20-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_20-7B-slerp](https://huggingface.co/Gille/StrangeMerges_20-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Gille__StrangeMerges_20-7B-slerp",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-04-02T23:29:40.441351](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_20-7B-slerp/blob/main/results_2024-04-02T23-29-40.441351.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6568209906863018,
"acc_stderr": 0.03197360214037413,
"acc_norm": 0.655947230300773,
"acc_norm_stderr": 0.03264361824443875,
"mc1": 0.5679314565483476,
"mc1_stderr": 0.01734120239498833,
"mc2": 0.7090103408223626,
"mc2_stderr": 0.014840595946247998
},
"harness|arc:challenge|25": {
"acc": 0.7098976109215017,
"acc_stderr": 0.013261573677520767,
"acc_norm": 0.7312286689419796,
"acc_norm_stderr": 0.012955065963710698
},
"harness|hellaswag|10": {
"acc": 0.7149970125473013,
"acc_stderr": 0.004504932999736407,
"acc_norm": 0.8844851623182632,
"acc_norm_stderr": 0.003189889789404672
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.674074074074074,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.674074074074074,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544067,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7916666666666666,
"acc_stderr": 0.033961162058453336,
"acc_norm": 0.7916666666666666,
"acc_norm_stderr": 0.033961162058453336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.02289168798455496,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.02289168798455496
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.02371088850197057,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.02371088850197057
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886793,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993464,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993464
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069363,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069363
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4346368715083799,
"acc_stderr": 0.016578997435496713,
"acc_norm": 0.4346368715083799,
"acc_norm_stderr": 0.016578997435496713
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.023891879541959607,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.023891879541959607
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47392438070404175,
"acc_stderr": 0.012752858346533126,
"acc_norm": 0.47392438070404175,
"acc_norm_stderr": 0.012752858346533126
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462927,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462927
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806315,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806315
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578323,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578323
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5679314565483476,
"mc1_stderr": 0.01734120239498833,
"mc2": 0.7090103408223626,
"mc2_stderr": 0.014840595946247998
},
"harness|winogrande|5": {
"acc": 0.8342541436464088,
"acc_stderr": 0.010450899545370625
},
"harness|gsm8k|5": {
"acc": 0.7217589082638363,
"acc_stderr": 0.012343803671422677
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
vibhorag101/phr-mental-therapy-dataset-conversational-format | ---
dataset_info:
features:
- name: id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 326215522
num_examples: 69360
- name: test
num_bytes: 69767441
num_examples: 14863
- name: val
num_bytes: 69987633
num_examples: 14863
download_size: 213309700
dataset_size: 465970596
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: val
path: data/val-*
---
|
JDaniel423/running-records-errors-dataset | ---
license: cc-by-4.0
task_categories:
- token-classification
language:
- en
size_categories:
- 100K<n<1M
tags:
- education
dataset_info:
features:
- name: audio_path
dtype: string
- name: asr_transcript
dtype: string
- name: original_text
dtype: string
- name: mutated_text
dtype: string
- name: index_tags
dtype: string
- name: mutated_tags
dtype: string
splits:
- name: DEL
num_bytes: 208676326
num_examples: 351867
- name: SUB
num_bytes: 243003228
num_examples: 351867
- name: REP
num_bytes: 303304320
num_examples: 351867
download_size: 0
dataset_size: 754983874
---
# Dataset Card for Running Records Errors Dataset
## Dataset Description
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
The Running Records Errors dataset is an English-language dataset containing 1,055,601 sentences based on the Europarl corpus. As described in our paper,
we take the sentences from the English version of the Europarl corpus and randomly inject three types of errors into the sentences: *repetitions*, where
certain words or phrases are repeated, *substitutions*, where certain words are replaced with a different word, and *deletions*, where the word is completely
omitted. The sentences are then passed into a TTS pipeline consisting of Tacotron 2 and HiFi-GAN models to produce audio recordings of those mutated sentences. Lastly,
the data is passed into a QuartzNet 15x5 model which produces a transcript of the spoken audio.
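The three error types described above can be illustrated with a minimal sketch. Note this is a hypothetical simplification for illustration only: the actual injection procedure from the paper (error rates, which words are eligible, how substitutes are chosen) may differ, and the substitute word list here is invented.

```python
import random

def mutate(sentence, mode, rng=random.Random(0)):
    """Apply one of the three error types (DEL, REP, SUB) to a random word.

    Simplified sketch; the paper's actual injection procedure may differ.
    """
    words = sentence.split()
    i = rng.randrange(len(words))
    if mode == "DEL":
        # deletion: the word is completely omitted
        words.pop(i)
    elif mode == "REP":
        # repetition: the word is spoken twice in a row
        words.insert(i, words[i])
    elif mode == "SUB":
        # substitution: the word is replaced with a different word
        # (substitute vocabulary here is purely illustrative)
        words[i] = rng.choice(["house", "time", "people"])
    return " ".join(words)

print(mutate("the cat sat on the mat", "REP"))
```

Each split of the dataset (DEL, REP, SUB) contains the same 351,867 source sentences with the corresponding mutation applied.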
### Supported Tasks and Leaderboards
The original purpose of this dataset was to construct a model pipeline that could score running records assessments given a transcript of a child's speech along with
the true text for that assessment. However, we provide this dataset to support other tasks involving error detection in text.
### Languages
All of the data in the dataset is in English.
## Dataset Structure
### Data Instances
For each instance, there is a string for the audio transcript, a string for the original text before we added any errors, as well as a string of the sentence with the errors we generated.
In addition, we provide two lists. One list denotes the original position of each word in the mutated text, and the second list denotes the error applied to that word.
### Data Fields
- asr_transcript: The transcript of the audio processed by our Quartznet 15x5 model.
- original_text: The original text that was in the Europarl corpus. This text contains no artificial errors.
- mutated_text: This text contains the errors we injected.
- index_tags: This list denotes the original position of each word in `mutated_text`.
- mutated_tags: This list denotes the error applied to each word in `mutated_text`.
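To make the alignment between `mutated_text` and the two tag lists concrete, here is a hypothetical sketch for a deletion error. The field names mirror the card, but the greedy alignment logic and the `"C"` (correct) tag value are assumptions for illustration, not the dataset's actual tagging scheme.

```python
# Illustrative only: align a mutated sentence (with one deletion) back to
# the original, recovering each surviving word's original position.
original_text = "the cat sat on the mat"
mutated_text = "the cat on the mat"  # "sat" was deleted

orig_words = original_text.split()
mut_words = mutated_text.split()

index_tags, mutated_tags = [], []
oi = 0
for w in mut_words:
    while orig_words[oi] != w:
        oi += 1  # skip over words deleted from the original
    index_tags.append(oi)       # original position of this word
    mutated_tags.append("C")    # "C" = kept unchanged (assumed tag name)
    oi += 1

print(index_tags)  # → [0, 1, 3, 4, 5]; position 2 ("sat") is missing
```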
### Data Splits
- DEL: Sentences that have had random words removed.
- REP: Sentences that have had repetitions inserted.
- SUB: Sentences that have had words randomly substituted.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
## Additional Information
### Dataset Curators
This dataset was generated with the guidance of Carl Ehrett. |
liuyanchen1015/MULTI_VALUE_mnli_for_complementizer | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 748206
num_examples: 3149
- name: dev_mismatched
num_bytes: 835660
num_examples: 3393
- name: test_matched
num_bytes: 733363
num_examples: 3036
- name: test_mismatched
num_bytes: 845923
num_examples: 3455
- name: train
num_bytes: 29607957
num_examples: 122902
download_size: 20812192
dataset_size: 32771109
---
# Dataset Card for "MULTI_VALUE_mnli_for_complementizer"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
loubnabnl/sample_jupyter | ---
dataset_info:
features:
- name: path
dtype: string
- name: content_id
dtype: string
- name: detected_licenses
sequence: string
- name: license_type
dtype: string
- name: repo_name
dtype: string
- name: repo_url
dtype: string
- name: star_events_count
dtype: int64
- name: fork_events_count
dtype: int64
- name: gha_license_id
dtype: string
- name: gha_event_created_at
dtype: timestamp[ns]
- name: gha_updated_at
dtype: timestamp[ns]
- name: gha_language
dtype: string
- name: language
dtype: string
- name: is_generated
dtype: bool
- name: is_vendor
dtype: bool
- name: conversion_extension
dtype: string
- name: size
dtype: int64
- name: script
dtype: string
- name: script_size
dtype: int64
splits:
- name: train
num_bytes: 128801038
num_examples: 5000
download_size: 74725541
dataset_size: 128801038
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
CyberHarem/serizawa_asahi_theidolmstershinycolors | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of serizawa_asahi/芹沢あさひ/세리자와아사히 (THE iDOLM@STER: SHINY COLORS)
This is the dataset of serizawa_asahi/芹沢あさひ/세리자와아사히 (THE iDOLM@STER: SHINY COLORS), containing 500 images and their tags.
The core tags of this character are `blue_eyes, bangs, short_hair, grey_hair, earrings`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 822.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/serizawa_asahi_theidolmstershinycolors/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 415.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/serizawa_asahi_theidolmstershinycolors/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1271 | 931.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/serizawa_asahi_theidolmstershinycolors/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 701.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/serizawa_asahi_theidolmstershinycolors/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1271 | 1.38 GiB | [Download](https://huggingface.co/datasets/CyberHarem/serizawa_asahi_theidolmstershinycolors/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/serizawa_asahi_theidolmstershinycolors',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 16 |  |  |  |  |  | 1girl, blush, looking_at_viewer, small_breasts, solo, collarbone, simple_background, white_background, smile, short_twintails, black_one-piece_swimsuit, thighs, cowboy_shot, open_mouth, twin_braids, ass |
| 1 | 7 |  |  |  |  |  | 1girl, looking_at_viewer, solo, long_sleeves, simple_background, white_background, blush, smile, white_shirt, closed_mouth, upper_body, open_jacket, purple_jacket |
| 2 | 13 |  |  |  |  |  | 1girl, long_sleeves, looking_at_viewer, solo, purple_jacket, simple_background, bike_shorts, fanny_pack, white_shirt, open_jacket, smile, sneakers, white_background, black_shorts, blush, full_body, open_mouth, sleeves_past_wrists, white_socks, upper_teeth_only |
| 3 | 5 |  |  |  |  |  | 1girl, bike_shorts, black_shorts, goggles_around_neck, long_sleeves, looking_at_viewer, sailor_collar, solo, blue_jacket, inline_skates, open_jacket, blush, grin, petals, white_shirt, blue_neckerchief, breasts, window |
| 4 | 6 |  |  |  |  |  | 1girl, sailor_collar, serafuku, simple_background, solo, white_background, jewelry, looking_at_viewer, upper_body, cardigan, long_sleeves, sleeves_past_wrists |
| 5 | 23 |  |  |  |  |  | 1girl, looking_at_viewer, cardigan, long_sleeves, serafuku, solo, plaid_skirt, blush, jewelry, pleated_skirt, simple_background, blue_skirt, blue_sailor_collar, white_background, open_mouth, :d |
| 6 | 5 |  |  |  |  |  | 1girl, bowtie, long_hair, looking_at_viewer, plaid_skirt, pleated_skirt, school_uniform, simple_background, solo, white_background, white_shirt, blue_skirt, blush, bracelet, kogal, school_bag, jacket_around_waist, long_sleeves, nail_polish, wrist_scrunchie, hairclip, holding, wavy_hair |
| 7 | 5 |  |  |  |  |  | 1girl, kogal, long_hair, looking_at_viewer, nail_polish, school_uniform, simple_background, solo, white_shirt, wrist_scrunchie, alternate_hair_length, blue_nails, blush, hairclip, jacket_around_waist, open_collar, plaid_skirt, wavy_hair, white_background, bag, blue_skirt, bracelet, grin, loose_bowtie, loose_socks, pleated_skirt, polka_dot, upper_body, white_socks |
| 8 | 10 |  |  |  |  |  | 1boy, 1girl, blush, hetero, penis, bar_censor, fellatio, solo_focus, :>=, cardigan, long_sleeves, looking_at_viewer, pov, serafuku, male_pubic_hair, blue_sailor_collar, blue_skirt, clothed_female_nude_male, cum, heart-shaped_pupils, jewelry, plaid_skirt |
| 9 | 8 |  |  |  |  |  | 1girl, looking_at_viewer, solo, two_side_up, blush, long_sleeves, stuffed_bunny, stuffed_cat, closed_mouth, object_hug, pink_jacket, teddy_bear, :t, pout, black_shorts, brown_hair, short_shorts, sitting, sleeves_past_wrists, blurry_background, jewelry, knees_up, pink_footwear, shirt, shoes, simple_background, white_background |
| 10 | 6 |  |  |  |  |  | 1girl, collarbone, looking_at_viewer, navel, small_breasts, stomach, white_bra, white_panties, parted_lips, simple_background, solo, underwear_only, white_background, blush |
| 11 | 8 |  |  |  |  |  | 1girl, looking_at_viewer, nail_polish, detached_sleeves, long_sleeves, red_nails, solo, bare_shoulders, red_skirt, smile, belt, hair_ribbon, pantyhose, red_ribbon, open_mouth, pleated_skirt, simple_background, black_shirt, blush, hair_between_eyes, jacket, miniskirt, navel, streaked_hair, white_background |
| 12 | 9 |  |  |  |  |  | 1girl, crown, jewelry, solo, white_gloves, frilled_sleeves, looking_at_viewer, fur_trim, smile, cape, epaulettes, long_sleeves, white_shorts, belt, simple_background, sitting, thigh_strap, white_footwear, white_shirt |
| 13 | 41 |  |  |  |  |  | 1girl, maid_apron, solo, maid_headdress, black_dress, looking_at_viewer, enmaided, frills, juliet_sleeves, blush, smile, simple_background, single_hair_bun, holding, open_mouth, white_background, white_apron |
| 14 | 5 |  |  |  |  |  | 1girl, braid, christmas, looking_at_viewer, santa_costume, santa_hat, smile, solo, blush, red_headwear, sack, star_(symbol), holding, long_sleeves, open_mouth, simple_background, white_background, belt, blurry_background, capelet, depth_of_field, fur-trimmed_headwear, fur-trimmed_shorts, fur-trimmed_sleeves, jewelry, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | looking_at_viewer | small_breasts | solo | collarbone | simple_background | white_background | smile | short_twintails | black_one-piece_swimsuit | thighs | cowboy_shot | open_mouth | twin_braids | ass | long_sleeves | white_shirt | closed_mouth | upper_body | open_jacket | purple_jacket | bike_shorts | fanny_pack | sneakers | black_shorts | full_body | sleeves_past_wrists | white_socks | upper_teeth_only | goggles_around_neck | sailor_collar | blue_jacket | inline_skates | grin | petals | blue_neckerchief | breasts | window | serafuku | jewelry | cardigan | plaid_skirt | pleated_skirt | blue_skirt | blue_sailor_collar | :d | bowtie | long_hair | school_uniform | bracelet | kogal | school_bag | jacket_around_waist | nail_polish | wrist_scrunchie | hairclip | holding | wavy_hair | alternate_hair_length | blue_nails | open_collar | bag | loose_bowtie | loose_socks | polka_dot | 1boy | hetero | penis | bar_censor | fellatio | solo_focus | :>= | pov | male_pubic_hair | clothed_female_nude_male | cum | heart-shaped_pupils | two_side_up | stuffed_bunny | stuffed_cat | object_hug | pink_jacket | teddy_bear | :t | pout | brown_hair | short_shorts | sitting | blurry_background | knees_up | pink_footwear | shirt | shoes | navel | stomach | white_bra | white_panties | parted_lips | underwear_only | detached_sleeves | red_nails | bare_shoulders | red_skirt | belt | hair_ribbon | pantyhose | red_ribbon | black_shirt | hair_between_eyes | jacket | miniskirt | streaked_hair | crown | white_gloves | frilled_sleeves | fur_trim | cape | epaulettes | white_shorts | thigh_strap | white_footwear | maid_apron | maid_headdress | black_dress | enmaided | frills | juliet_sleeves | single_hair_bun | white_apron | braid | christmas | santa_costume | santa_hat | red_headwear | sack | star_(symbol) | capelet | depth_of_field | fur-trimmed_headwear | fur-trimmed_shorts | fur-trimmed_sleeves |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------|:--------------------|:----------------|:-------|:-------------|:--------------------|:-------------------|:--------|:------------------|:---------------------------|:---------|:--------------|:-------------|:--------------|:------|:---------------|:--------------|:---------------|:-------------|:--------------|:----------------|:--------------|:-------------|:-----------|:---------------|:------------|:----------------------|:--------------|:-------------------|:----------------------|:----------------|:--------------|:----------------|:-------|:---------|:-------------------|:----------|:---------|:-----------|:----------|:-----------|:--------------|:----------------|:-------------|:---------------------|:-----|:---------|:------------|:-----------------|:-----------|:--------|:-------------|:----------------------|:--------------|:------------------|:-----------|:----------|:------------|:------------------------|:-------------|:--------------|:------|:---------------|:--------------|:------------|:-------|:---------|:--------|:-------------|:-----------|:-------------|:------|:------|:------------------|:---------------------------|:------|:----------------------|:--------------|:----------------|:--------------|:-------------|:--------------|:-------------|:-----|:-------|:-------------|:---------------|:----------|:--------------------|:-----------|:----------------|:--------|:--------|:--------|:----------|:------------|:----------------|:--------------|:-----------------|:-------------------|:------------|:-----------------|:------------|:-------|:--------------|:------------|:-------------|:--------------|:--------------------|:---------|:------------|:----------------|:--------|:---------------|:------------------|:-----------|:-------|:-------------|:---------------|:--------------|:-----------------|:-------------|:-----------------|:--------------|:-----------|:---------|:-----------------|:------------------|:--------------|:--------|:------------|:----------------|:------------|:---------------|:-------|:----------------|:----------|:-----------------|:-----------------------|:---------------------|:----------------------|
| 0 | 16 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | | X | | X | X | X | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 13 |  |  |  |  |  | X | X | X | | X | | X | X | X | | | | | X | | | X | X | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | X | | X | | | | | | | | | | | | X | X | | | X | | X | | | X | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | | X | | X | | X | X | | | | | | | | | X | | | X | | | | | | | | X | | | | X | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 23 |  |  |  |  |  | X | X | X | | X | | X | X | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | X | X | | X | | X | X | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | X | X | | X | | X | X | | | | | | | | | | X | | X | | | | | | | | | X | | | | | | X | | | | | | | | X | X | X | | | | X | X | X | X | | X | X | X | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 10 |  |  |  |  |  | X | X | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | | X | X | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 8 |  |  |  |  |  | X | X | X | | X | | X | X | | | | | | | | | X | | X | | | | | | | X | | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 11 | 8 |  |  |  |  |  | X | X | X | | X | | X | X | X | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 12 | 9 |  |  |  |  |  | X | | X | | X | | X | | X | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 13 | 41 |  |  |  |  |  | X | X | X | | X | | X | X | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 14 | 5 |  |  |  |  |  | X | X | X | | X | | X | X | X | | | | | X | | | X | | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
pranjali97/ha-en_RL-grow1_valid_scorecompare | ---
dataset_info:
features:
- name: src
dtype: string
- name: ref
dtype: string
- name: mt
dtype: string
- name: score_da
dtype: float64
- name: score_rfree
dtype: float64
splits:
- name: train
num_bytes: 1579988
num_examples: 3339
download_size: 0
dataset_size: 1579988
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "ha-en_RL-grow1_valid_scorecompare"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Kabilan108/spectrograms | ---
license: apache-2.0
size_categories:
- 1K<n<10K
task_categories:
- audio-classification
tags:
- music
- spectrogram
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': afrobeats
'1': rock
splits:
- name: train
num_bytes: 414773092.453
num_examples: 2033
download_size: 404353267
dataset_size: 414773092.453
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
ksabeh/openbrand | ---
dataset_info:
features:
- name: category
dtype: string
- name: title
dtype: string
- name: brand
dtype: string
- name: asin
dtype: string
- name: imageURL
dtype: string
- name: position_index
dtype: int64
- name: num_tokens
dtype: int64
- name: title_length
dtype: int64
- name: title_category
dtype: string
splits:
- name: train
num_bytes: 68007488
num_examples: 181551
- name: test
num_bytes: 18875793
num_examples: 50432
- name: automotive
num_bytes: 4523220
num_examples: 12891
- name: cellphones
num_bytes: 51882096
num_examples: 78478
- name: clothes
num_bytes: 37489496
num_examples: 85052
- name: electronics
num_bytes: 4820108
num_examples: 9568
- name: grocery
num_bytes: 1567047
num_examples: 4475
- name: new_cat
num_bytes: 93547671
num_examples: 174381
- name: pets
num_bytes: 4175961
num_examples: 10851
- name: sports
num_bytes: 3804172
num_examples: 10841
- name: toys
num_bytes: 4161246
num_examples: 12657
- name: val
num_bytes: 7583420
num_examples: 20172
download_size: 110231234
dataset_size: 300437718
---
# Dataset Card for "openbrand"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
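The split sizes listed in the metadata above imply roughly a 72/20/8 train/test/val ratio. A quick sketch computing the proportions (the counts are copied from this card's YAML):

```python
# Split sizes for train/test/val as listed in this card's metadata
splits = {"train": 181551, "test": 50432, "val": 20172}
total = sum(splits.values())
for name, count in splits.items():
    # Print each split's share of the combined example count
    print(f"{name}: {count / total:.1%}")
```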
Basirudin/aka_generic_ner | ---
license: apache-2.0
---
kejian/SciReviewGen | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: reference
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 1017206768
num_examples: 84705
- name: validation
num_bytes: 52660512
num_examples: 4410
- name: test
num_bytes: 54202617
num_examples: 4457
download_size: 507188880
dataset_size: 1124069897
---
# Dataset Card for "SciReviewGen"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
peeper/vitmae-roberta-processed | ---
dataset_info:
features:
- name: labels
sequence: int64
- name: pixel_values
sequence:
sequence:
sequence: float32
splits:
- name: train
num_bytes: 2567566872
num_examples: 4238
- name: test
num_bytes: 856057572
num_examples: 1413
download_size: 1000718544
dataset_size: 3423624444
---
# Dataset Card for "vitmae-roberta-processed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)