datasetId stringlengths 2 117 | card stringlengths 19 1.01M |
|---|---|
shokhjakhon/zakon_data | ---
license: cdla-permissive-1.0
---
|
venetis/symptom_text_to_disease_mk4 | ---
dataset_info:
features:
- name: text
dtype: string
- name: labels
dtype:
class_label:
names:
'0': emotional pain
'1': hair falling out
'2': heart hurts
'3': infected wound
'4': foot ache
'5': shoulder pain
'6': injury from sports
'7': skin issue
'8': stomach ache
'9': knee pain
'10': joint pain
'11': hard to breath
'12': head ache
'13': body feels weak
'14': feeling dizzy
'15': back pain
'16': open wound
'17': internal pain
'18': blurry vision
'19': acne
'20': muscle pain
'21': neck pain
'22': cough
'23': ear ache
'24': feeling cold
splits:
- name: train
num_bytes: 330494.3762197868
num_examples: 5328
- name: test
num_bytes: 41373.82675273983
num_examples: 667
- name: valid
num_bytes: 41311.79702747335
num_examples: 666
download_size: 144224
dataset_size: 413180.0
---
# Dataset Card for "symptom_text_to_disease_mk4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_rte_reduced_relative | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 206462
num_examples: 458
- name: train
num_bytes: 199088
num_examples: 448
download_size: 269179
dataset_size: 405550
---
# Dataset Card for "MULTI_VALUE_rte_reduced_relative"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yoruba_gv_ner | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- yo
license:
- cc-by-3.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- token-classification
task_ids:
- named-entity-recognition
pretty_name: Yoruba GV NER Corpus
dataset_info:
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PER
'2': I-PER
'3': B-ORG
'4': I-ORG
'5': B-LOC
'6': I-LOC
'7': B-DATE
'8': I-DATE
config_name: yoruba_gv_ner
splits:
- name: train
num_bytes: 358885
num_examples: 817
- name: validation
num_bytes: 50161
num_examples: 117
- name: test
num_bytes: 96518
num_examples: 237
download_size: 254347
dataset_size: 505564
---
# Dataset Card for Yoruba GV NER Corpus
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:** [Yoruba GV NER](https://github.com/ajesujoba/YorubaTwi-Embedding/tree/master/Yoruba/Yoruba-NER)
- **Paper:** https://www.aclweb.org/anthology/2020.lrec-1.335/
- **Leaderboard:**
- **Point of Contact:** [David Adelani](mailto:didelani@lsv.uni-saarland.de)
### Dataset Summary
The Yoruba GV NER corpus is a named entity recognition (NER) dataset for the Yorùbá language based on the [Global Voices news](https://yo.globalvoices.org/) corpus. Global Voices (GV) is a multilingual news platform with articles contributed by journalists, translators, bloggers, and human rights activists from around the world, covering more than 50 languages. Most of the texts used in creating the Yoruba GV NER corpus are translations into Yorùbá from other languages.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
The language supported is Yorùbá.
## Dataset Structure
### Data Instances
A data point consists of sentences separated by empty lines, with tab-separated tokens and tags:
```
{'id': '0',
 'ner_tags': ['B-LOC', 'O', 'O', 'O', 'O'],
 'tokens': ['Tanzania', 'fi', 'Ajìjàgbara', 'Ọmọ', 'Orílẹ̀-èdèe']
}
```
### Data Fields
- `id`: id of the sample
- `tokens`: the tokens of the example text
- `ner_tags`: the NER tags of each token
The NER tags correspond to this list:
```
"O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-LOC", "I-LOC", "B-DATE", "I-DATE",
```
The NER tags have the same format as in the CoNLL shared task: a B denotes the first item of a phrase and an I any non-initial word. There are four types of phrases: person names (PER), organizations (ORG), locations (LOC) and dates & times (DATE). (O) is used for tokens not considered part of any named entity.
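The BIO decoding rule described above can be sketched in plain Python. The `decode_entities` helper below is illustrative, not part of the dataset loader; it maps the integer class labels back to tag strings and groups tokens into entity spans:

```python
# Tag list as declared in the dataset's class_label names.
NER_TAGS = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG",
            "B-LOC", "I-LOC", "B-DATE", "I-DATE"]

def decode_entities(tokens, tag_ids):
    """Group BIO-tagged tokens into (entity_type, text) spans."""
    entities, current_type, current_tokens = [], None, []
    for token, tag_id in zip(tokens, tag_ids):
        tag = NER_TAGS[tag_id]
        if tag.startswith("B-"):
            # A B- tag always opens a new span, closing any open one.
            if current_tokens:
                entities.append((current_type, " ".join(current_tokens)))
            current_type, current_tokens = tag[2:], [token]
        elif tag.startswith("I-") and current_type == tag[2:]:
            # An I- tag continues the open span of the same type.
            current_tokens.append(token)
        else:
            # "O", or an I- tag that does not continue the open span.
            if current_tokens:
                entities.append((current_type, " ".join(current_tokens)))
            current_type, current_tokens = None, []
    if current_tokens:
        entities.append((current_type, " ".join(current_tokens)))
    return entities

example = {"tokens": ["Tanzania", "fi", "Ajìjàgbara", "Ọmọ", "Orílẹ̀-èdèe"],
           "ner_tags": [5, 0, 0, 0, 0]}  # 5 == "B-LOC", 0 == "O"
print(decode_entities(example["tokens"], example["ner_tags"]))
# → [('LOC', 'Tanzania')]
```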
### Data Splits
The data is split into training (19,421 tokens), validation (2,695 tokens) and test (5,235 tokens) sets.
## Dataset Creation
### Curation Rationale
The data was created to help introduce resources for a new language, Yorùbá.
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
The dataset is based on the news domain and was crawled from [Global Voices Yorùbá news](https://yo.globalvoices.org/).
[More Information Needed]
#### Who are the source language producers?
The dataset was contributed by journalists, translators, bloggers, and human rights activists from around the world. Most of the texts used in creating the Yoruba GV NER corpus are translations into Yorùbá from other languages.
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
The data was annotated by Jesujoba Alabi and David Adelani for the paper:
[Massive vs. Curated Embeddings for Low-Resourced Languages: the case of Yorùbá and Twi](https://www.aclweb.org/anthology/2020.lrec-1.335/).
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
The annotated data sets were developed by students of Saarland University, Saarbrücken, Germany.
### Licensing Information
The data is licensed under [Creative Commons Attribution 3.0](https://creativecommons.org/licenses/by/3.0/).
### Citation Information
```
@inproceedings{alabi-etal-2020-massive,
title = "Massive vs. Curated Embeddings for Low-Resourced Languages: the Case of {Y}or{\`u}b{\'a} and {T}wi",
author = "Alabi, Jesujoba and
Amponsah-Kaakyire, Kwabena and
Adelani, David and
Espa{\~n}a-Bonet, Cristina",
booktitle = "Proceedings of the 12th Language Resources and Evaluation Conference",
month = may,
year = "2020",
address = "Marseille, France",
publisher = "European Language Resources Association",
url = "https://www.aclweb.org/anthology/2020.lrec-1.335",
pages = "2754--2762",
language = "English",
ISBN = "979-10-95546-34-4",
}
```
### Contributions
Thanks to [@dadelani](https://github.com/dadelani) for adding this dataset. |
freshpearYoon/v3_train_free_10 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 15366798568
num_examples: 10000
download_size: 2083419270
dataset_size: 15366798568
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
dltdojo/ecommerce-faq-chatbot-dataset | ---
dataset_info:
features:
- name: a_hant
dtype: string
- name: answer
dtype: string
- name: question
dtype: string
- name: q_hant
dtype: string
splits:
- name: train
num_bytes: 28737
num_examples: 79
download_size: 17499
dataset_size: 28737
---
# Dataset Card for "ecommerce-faq-chatbot-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CVasNLPExperiments/VQAv2_sample_validation_google_flan_t5_xl_mode_T_A_Q_rices_ns_200 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_all_patches_Salesforce_blip_image_captioning_large__
num_bytes: 28843
num_examples: 200
download_size: 14303
dataset_size: 28843
---
# Dataset Card for "VQAv2_sample_validation_google_flan_t5_xl_mode_T_A_Q_rices_ns_200"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
corbt/unlabeled-recipies | ---
dataset_info:
features:
- name: recipe
dtype: string
splits:
- name: train
num_bytes: 2793853
num_examples: 5000
download_size: 1465640
dataset_size: 2793853
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "unlabeled-recipies"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
anumafzal94/pubmed-2shot-4096 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
- name: summary
dtype: string
- name: few-shot
dtype: bool
splits:
- name: test
num_bytes: 8149116.593446602
num_examples: 426
- name: train
num_bytes: 139802654.7469022
num_examples: 7242
download_size: 20828412
dataset_size: 147951771.3403488
---
# Dataset Card for "pubmed-2shot-4096"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Babelscape/SREDFM | ---
dataset_info:
- config_name: ar
features:
- name: docid
dtype: string
- name: title
dtype: string
- name: uri
dtype: string
- name: text
dtype: string
- name: entities
list:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
- name: relations
list:
- name: subject
dtype: int32
- name: predicate
dtype: string
- name: object
dtype: int32
splits:
- name: train
num_bytes: 659105981
num_examples: 499568
- name: test
num_bytes: 9015516
num_examples: 4387
- name: validation
num_bytes: 7406509
num_examples: 3783
download_size: 3651950669
dataset_size: 675528006
- config_name: ca
features:
- name: docid
dtype: string
- name: title
dtype: string
- name: uri
dtype: string
- name: text
dtype: string
- name: entities
list:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
- name: relations
list:
- name: subject
dtype: int32
- name: predicate
dtype: string
- name: object
dtype: int32
splits:
- name: train
num_bytes: 406179567
num_examples: 294856
- name: test
num_bytes: 5378789
num_examples: 2541
- name: validation
num_bytes: 3136722
num_examples: 1532
download_size: 1513026644
dataset_size: 414695078
- config_name: de
features:
- name: docid
dtype: string
- name: title
dtype: string
- name: uri
dtype: string
- name: text
dtype: string
- name: entities
list:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
- name: relations
list:
- name: subject
dtype: int32
- name: predicate
dtype: string
- name: object
dtype: int32
splits:
- name: train
num_bytes: 1288274676
num_examples: 1049967
- name: test
num_bytes: 10773087
num_examples: 5649
- name: validation
num_bytes: 8955886
num_examples: 4994
download_size: 4521091910
dataset_size: 1308003649
- config_name: el
features:
- name: docid
dtype: string
- name: title
dtype: string
- name: uri
dtype: string
- name: text
dtype: string
- name: entities
list:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
- name: relations
list:
- name: subject
dtype: int32
- name: predicate
dtype: string
- name: object
dtype: int32
splits:
- name: train
num_bytes: 133497910
num_examples: 64221
- name: test
num_bytes: 2364826
num_examples: 861
- name: validation
num_bytes: 1836092
num_examples: 668
download_size: 579372781
dataset_size: 137698828
- config_name: en
features:
- name: docid
dtype: string
- name: title
dtype: string
- name: uri
dtype: string
- name: text
dtype: string
- name: entities
list:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
- name: relations
list:
- name: subject
dtype: int32
- name: predicate
dtype: string
- name: object
dtype: int32
splits:
- name: train
num_bytes: 3555107736
num_examples: 2701389
- name: test
num_bytes: 13160183
num_examples: 6685
- name: validation
num_bytes: 27692074
num_examples: 13236
download_size: 11914987368
dataset_size: 3595959993
- config_name: es
features:
- name: docid
dtype: string
- name: title
dtype: string
- name: uri
dtype: string
- name: text
dtype: string
- name: entities
list:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
- name: relations
list:
- name: subject
dtype: int32
- name: predicate
dtype: string
- name: object
dtype: int32
splits:
- name: train
num_bytes: 888914515
num_examples: 702785
- name: test
num_bytes: 16076382
num_examples: 8561
- name: validation
num_bytes: 4621760
num_examples: 2177
download_size: 3570403740
dataset_size: 909612657
- config_name: fr
features:
- name: docid
dtype: string
- name: title
dtype: string
- name: uri
dtype: string
- name: text
dtype: string
- name: entities
list:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
- name: relations
list:
- name: subject
dtype: int32
- name: predicate
dtype: string
- name: object
dtype: int32
splits:
- name: train
num_bytes: 768697146
num_examples: 870448
- name: test
num_bytes: 5937745
num_examples: 3883
- name: validation
num_bytes: 3233262
num_examples: 2079
download_size: 3269522484
dataset_size: 777868153
- config_name: hi
features:
- name: docid
dtype: string
- name: title
dtype: string
- name: uri
dtype: string
- name: text
dtype: string
- name: entities
list:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
- name: relations
list:
- name: subject
dtype: int32
- name: predicate
dtype: string
- name: object
dtype: int32
splits:
- name: train
num_bytes: 96926984
num_examples: 51900
- name: test
num_bytes: 1340091
num_examples: 374
- name: validation
num_bytes: 1222098
num_examples: 405
download_size: 385810623
dataset_size: 99489173
- config_name: it
features:
- name: docid
dtype: string
- name: title
dtype: string
- name: uri
dtype: string
- name: text
dtype: string
- name: entities
list:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
- name: relations
list:
- name: subject
dtype: int32
- name: predicate
dtype: string
- name: object
dtype: int32
splits:
- name: train
num_bytes: 436879977
num_examples: 432076
- name: test
num_bytes: 3798221
num_examples: 2175
- name: validation
num_bytes: 2230995
num_examples: 1276
download_size: 1685172398
dataset_size: 442909193
- config_name: ja
features:
- name: docid
dtype: string
- name: title
dtype: string
- name: uri
dtype: string
- name: text
dtype: string
- name: entities
list:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
- name: relations
list:
- name: subject
dtype: int32
- name: predicate
dtype: string
- name: object
dtype: int32
splits:
- name: train
num_bytes: 708617436
num_examples: 480785
- name: test
num_bytes: 7802066
num_examples: 3392
- name: validation
num_bytes: 6990637
num_examples: 3106
download_size: 3186065351
dataset_size: 723410139
- config_name: ko
features:
- name: docid
dtype: string
- name: title
dtype: string
- name: uri
dtype: string
- name: text
dtype: string
- name: entities
list:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
- name: relations
list:
- name: subject
dtype: int32
- name: predicate
dtype: string
- name: object
dtype: int32
splits:
- name: train
num_bytes: 266381416
num_examples: 213659
- name: test
num_bytes: 1736809
num_examples: 803
- name: validation
num_bytes: 1857229
num_examples: 917
download_size: 1119778167
dataset_size: 269975454
- config_name: nl
features:
- name: docid
dtype: string
- name: title
dtype: string
- name: uri
dtype: string
- name: text
dtype: string
- name: entities
list:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
- name: relations
list:
- name: subject
dtype: int32
- name: predicate
dtype: string
- name: object
dtype: int32
splits:
- name: train
num_bytes: 695855128
num_examples: 648029
- name: test
num_bytes: 5186584
num_examples: 2715
- name: validation
num_bytes: 4188877
num_examples: 2188
download_size: 2591997126
dataset_size: 705230589
- config_name: pl
features:
- name: docid
dtype: string
- name: title
dtype: string
- name: uri
dtype: string
- name: text
dtype: string
- name: entities
list:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
- name: relations
list:
- name: subject
dtype: int32
- name: predicate
dtype: string
- name: object
dtype: int32
splits:
- name: train
num_bytes: 877441685
num_examples: 675688
- name: test
num_bytes: 11475559
num_examples: 6376
- name: validation
num_bytes: 6618989
num_examples: 3476
download_size: 3365852789
dataset_size: 895536233
- config_name: pt
features:
- name: docid
dtype: string
- name: title
dtype: string
- name: uri
dtype: string
- name: text
dtype: string
- name: entities
list:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
- name: relations
list:
- name: subject
dtype: int32
- name: predicate
dtype: string
- name: object
dtype: int32
splits:
- name: train
num_bytes: 584986936
num_examples: 469347
- name: test
num_bytes: 8678707
num_examples: 4313
- name: validation
num_bytes: 5807293
num_examples: 2973
download_size: 2347987926
dataset_size: 599472936
- config_name: ru
features:
- name: docid
dtype: string
- name: title
dtype: string
- name: uri
dtype: string
- name: text
dtype: string
- name: entities
list:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
- name: relations
list:
- name: subject
dtype: int32
- name: predicate
dtype: string
- name: object
dtype: int32
splits:
- name: train
num_bytes: 604993210
num_examples: 339697
- name: test
num_bytes: 5941158
num_examples: 2296
- name: validation
num_bytes: 5352859
num_examples: 2107
download_size: 2754576893
dataset_size: 616287227
- config_name: sv
features:
- name: docid
dtype: string
- name: title
dtype: string
- name: uri
dtype: string
- name: text
dtype: string
- name: entities
list:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
- name: relations
list:
- name: subject
dtype: int32
- name: predicate
dtype: string
- name: object
dtype: int32
splits:
- name: train
num_bytes: 1822863623
num_examples: 1742082
- name: test
num_bytes: 13002356
num_examples: 7531
- name: validation
num_bytes: 5136097
num_examples: 2987
download_size: 6790489020
dataset_size: 1841002076
- config_name: vi
features:
- name: docid
dtype: string
- name: title
dtype: string
- name: uri
dtype: string
- name: text
dtype: string
- name: entities
list:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
- name: relations
list:
- name: subject
dtype: int32
- name: predicate
dtype: string
- name: object
dtype: int32
splits:
- name: train
num_bytes: 300641174
num_examples: 260010
- name: test
num_bytes: 4304795
num_examples: 1824
- name: validation
num_bytes: 3402120
num_examples: 1461
download_size: 1301938106
dataset_size: 308348089
- config_name: zh
features:
- name: docid
dtype: string
- name: title
dtype: string
- name: uri
dtype: string
- name: text
dtype: string
- name: entities
list:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
- name: relations
list:
- name: subject
dtype: int32
- name: predicate
dtype: string
- name: object
dtype: int32
splits:
- name: train
num_bytes: 449085696
num_examples: 369249
- name: test
num_bytes: 5260974
num_examples: 2667
- name: validation
num_bytes: 3511103
num_examples: 1816
download_size: 2440525684
dataset_size: 457857773
- config_name: all_languages
features:
- name: docid
dtype: string
- name: title
dtype: string
- name: uri
dtype: string
- name: lan
dtype: string
- name: text
dtype: string
- name: entities
list:
- name: uri
dtype: string
- name: surfaceform
dtype: string
- name: type
dtype: string
- name: start
dtype: int32
- name: end
dtype: int32
- name: relations
list:
- name: subject
dtype: int32
- name: predicate
dtype: string
- name: object
dtype: int32
splits:
- name: train
num_bytes: 14615645332
num_examples: 11865756
- name: test
num_bytes: 131636046
num_examples: 67033
- name: validation
num_bytes: 103507688
num_examples: 51181
download_size: 56989165879
dataset_size: 14850789066
task_categories:
- token-classification
language:
- ar
- ca
- de
- el
- en
- es
- fr
- hi
- it
- ja
- ko
- nl
- pl
- pt
- ru
- sv
- vi
- zh
size_categories:
- 10M<n<100M
license: cc-by-sa-4.0
---
# RED<sup>FM</sup>: a Filtered and Multilingual Relation Extraction Dataset
This is the automatically filtered dataset from the 2023 ACL paper [RED^{FM}: a Filtered and Multilingual Relation Extraction Dataset](https://arxiv.org/abs/2306.09802). If you use the dataset, please cite this work in your paper:
```
@inproceedings{huguet-cabot-et-al-2023-redfm-dataset,
    title = "RED$^{\rm FM}$: a Filtered and Multilingual Relation Extraction Dataset",
    author = "Huguet Cabot, Pere-Llu{\'\i}s and Tedeschi, Simone and Ngonga Ngomo, Axel-Cyrille and
      Navigli, Roberto",
    booktitle = "Proc. of the 61st Annual Meeting of the Association for Computational Linguistics: ACL 2023",
    month = jul,
    year = "2023",
    address = "Toronto, Canada",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/2306.09802",
}
```
## License
SRED<sup>FM</sup> is licensed under the CC BY-SA 4.0 license. The text of the license can be found [here](https://creativecommons.org/licenses/by-sa/4.0/). |
DFKI/radr_intents | ---
task_categories:
- text-classification
language:
- de
pretty_name: Intent Classification for Robot Assisted Disaster Response
size_categories:
- 100K<n<1M
---
# Dataset Card for "Intent Classification for Robot Assisted Disaster Response"
<!-- Provide a quick summary of the dataset. -->
This dataset consists of conversations recorded during the training sessions in the emergency response domain.
The conversations are typically between several operators controlling the robots, a team leader and a mission commander.
The data have been transcribed and annotated during the following projects: [TRADR](http://www.tradr-project.eu/) and [ADRZ](https://rettungsrobotik.de/home).
The dialogues are split into turns and each turn is annotated with a speaker and intent.
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** DFKI, [Talking Robots Group at MLT](https://www.dfki.de/en/web/research/research-departments/multilinguality-and-language-technology/tr-team)
<!-- - **Funded by [optional]:** [More Information Needed] -->
<!-- - **Shared by [optional]:** [More Information Needed] -->
- **Language(s) (NLP):** German
- **License:** [More Information Needed]
<!-- ### Dataset Sources [optional] -->
<!-- Provide the basic links for the dataset. -->
<!-- - **Repository:** [More Information Needed] -->
<!-- - **Paper [optional]:** [More Information Needed] -->
<!-- - **Demo [optional]:** [More Information Needed] -->
<!-- ## Uses -->
<!-- Address questions around how the dataset is intended to be used. -->
<!-- ### Direct Use -->
<!-- This section describes suitable use cases for the dataset. -->
<!-- [More Information Needed] -->
<!-- ### Out-of-Scope Use -->
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
<!-- [More Information Needed] -->
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
### Data Instances
```
{
'id': '1235',
'speaker': 'UAV',
'text': 'wir haben einmal den Akku gewechselt, bis jetzt noch kein Rauch festzustellen ...',
'label': 2
}
```
### Data Fields
```
id: the id of the dialogue turn, an `int` feature
speaker: the speaker of the turn, a `string` feature
text: the utterance of the turn, a `string` feature
label: the label of the turn, an `int` feature
```
### Data Splits
This dataset contains 3525 dialogue turns in total. The data are split as follows: 2610 turns for training, 310 for development and 605 for test. The data represent a continuous conversation, i.e., the previous id refers to the previous turn in the dialogue.
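Because consecutive ids correspond to consecutive turns of a continuous conversation, preceding turns can be attached as dialogue context. A minimal sketch (the `with_context` helper and the sample turns are illustrative, not part of the dataset):

```python
def with_context(turns, k=2):
    """Attach the k preceding utterances to each turn as context."""
    by_id = {int(t["id"]): t for t in turns}
    enriched = []
    for t in turns:
        i = int(t["id"])
        # Consecutive ids are consecutive dialogue turns, so the
        # previous k ids (where present) form the local context.
        context = [by_id[j]["text"] for j in range(i - k, i) if j in by_id]
        enriched.append({**t, "context": context})
    return enriched

turns = [{"id": "1", "text": "RobLW an Zugführer, kommen.", "label": 4},
         {"id": "2", "text": "Ja, hier ist Zugführer, kommen.", "label": 5},
         {"id": "3", "text": "Frage: kommen meine Fotos an?", "label": 3}]
print(with_context(turns, k=1)[2]["context"])
# → ['Ja, hier ist Zugführer, kommen.']
```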
### Label Description and Statistics
| label | meaning | train | percentage | example |
| --- | --- | --- | --- | --- |
| 0 | disconfirm | 35 | 1.3% | `Ist negativ, noch nicht.` |
| 1 | order | 216 | 8.3% | `Für Sie Erkundungsauftrag: Gesamtüberblick über die Einsatzstelle. Kommen.` |
| 2 | info_provide | 979 | 37.5% | `Ich verlasse das Erdgeschoss und gehe ins erste Obergeschoss.` |
| 3 | info_request | 238 | 9.1% | `Frage: Erkundungsergebnis aus der östlichen Seite des Gebäudes, kommen.` |
| 4 | call | 487 | 18.7% | `RobLW an Zugführer, kommen.` |
| 5 | call_response | 370 | 14.2% | `Ja, hier ist Zugführer, kommen.` |
| 6 | other | 43 | 1.7% | `Einen Augenblick, ich melde mich gleich.` |
| 7 | confirm | 242 | 9.3% | `Ein Lagebild von oben, komplette Lage, und ein Lagebild zwischen den beiden Türen, verstanden.` |
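The integer labels can be mapped back to intent names with a small lookup table. A sketch, with the ordering taken from the table above rather than from an official label file:

```python
# Intent names indexed by label id, following the table above.
INTENTS = ["disconfirm", "order", "info_provide", "info_request",
           "call", "call_response", "other", "confirm"]

# The example instance from the Data Instances section (text abridged).
turn = {"id": "1235", "speaker": "UAV",
        "text": "wir haben einmal den Akku gewechselt, ...", "label": 2}
print(f"[{turn['speaker']}] {INTENTS[turn['label']]}: {turn['text']}")
# → [UAV] info_provide: wir haben einmal den Akku gewechselt, ...
```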
## Dataset Creation
### Curation Rationale
The dataset is based on the recordings from the emergency response domain that use radio communication protocol. The goal of the conversation is to coordinate rescue operations in a robot-assisted disaster response.
### Source Data
The data are based on human-human communication in robot-assisted disaster response. The dialogues are task-oriented, focused on the collaborative execution of a mission by a team that uses robots to explore an area, find hazardous materials, and locate fires, damage or victims.
#### Data Collection and Processing
The initial audio recordings were collected during the [TRADR](http://www.tradr-project.eu/) and [ADRZ](https://rettungsrobotik.de/home) projects, then transcribed and annotated by the [Talking Robots Group, DFKI](https://www.dfki.de/en/web/research/research-departments/multilinguality-and-language-technology/tr-team).
<!--#### Who are the source data producers?-->
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
### Annotations
The annotations include dialogue intents relevant for communication in the emergency response domain: `call`, `call_response`, `info_request`, `info_provide`, `confirm`, `disconfirm`, `order` and `other`.
Note the interpretation of the intent depends on the context. E.g., the following examples illustrate how very similar responses ("Warten", "Wait") are annotated differently depending on the previous turn:
```
(1) disconfirm
- Können wir weitermachen? (Shall we continue?)
- Warten. (Wait.)
(2) confirm
- Hast du die Möglichkeit, das Fass näher zu identifizieren, was da drin ist? (Can you inspect the barrel closer to identify what is inside?)
- Ja, warten. (Yes, wait.)
(3) order
- Werde aber jetzt auch mal die rückwärtige Seite des Fasses erkunden. (I will inspect now the back side of the barrel.)
- UGV 1, damit warten. (UGV 1, wait.)
(4) other (pausing to check)
- Frage: kommen meine Fotos an? (Question: do you receive my photos?)
- Warten. (Wait.)
```
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
The recordings were manually transcribed and annotated with emergency response intents. There are 3525 dialogue turns in total with 6.3 tokens per turn on average.
#### Who are the annotators?
All annotations were done by research assistants of the [Talking Robots Group, DFKI](https://www.dfki.de/en/web/research/research-departments/multilinguality-and-language-technology/tr-team).
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
The dataset does not include any real names, addresses or other personal information. The recordings were done during training sessions with simulations of the emergency situation.
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
The dataset covers only a subset of possible emergency situations, focusing mainly on fire, building collapse and chemical leakage. It does not address many other situations, e.g., traffic accidents, floods or explosions.
<!--### Recommendations -->
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
<!-- Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. -->
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
Part of this dataset has been introduced in the following paper. However, the current version includes more annotated turns due to additional data collection.
**BibTeX:**
```
@inproceedings{anikina-2023-towards,
title = "Towards Efficient Dialogue Processing in the Emergency Response Domain",
author = "Anikina, Tatiana",
editor = "Padmakumar, Vishakh and
Vallejo, Gisela and
Fu, Yao",
booktitle = "Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 4: Student Research Workshop)",
month = jul,
year = "2023",
address = "Toronto, Canada",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.acl-srw.31",
doi = "10.18653/v1/2023.acl-srw.31",
pages = "212--225",
abstract = "In this paper we describe the task of adapting NLP models to dialogue processing in the emergency response domain. Our goal is to provide a recipe for building a system that performs dialogue act classification and domain-specific slot tagging while being efficient, flexible and robust. We show that adapter models Pfeiffer et al. (2020) perform well in the emergency response domain and benefit from additional dialogue context and speaker information. Comparing adapters to standard fine-tuned Transformer models we show that they achieve competitive results and can easily accommodate new tasks without significant memory increase since the base model can be shared between the adapters specializing on different tasks. We also address the problem of scarce annotations in the emergency response domain and evaluate different data augmentation techniques in a low-resource setting.",
}
```
**APA:**
```
Anikina, T. (2023). Towards efficient dialogue processing in the emergency response domain. In *Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 4: Student Research Workshop)* (pp. 212–225). Association for Computational Linguistics.
```
## Glossary
Abbreviations used for the speakers:

- UGV: Unmanned Ground Vehicle
- UAV: Unmanned Aerial Vehicle
- MC: Mission Commander
- TL: Team Leader
- RobLW: Robotikleitwagen (robotics command vehicle)
- ZF: Zugführer (fire brigade commander)
- GF: Gruppenführer (group leader)
- ELW: Einsatzleitwagen (emergency command vehicle)
- GW-DUK: Gerätewagen-Daten-und-Kommunikation (vehicle for transporting robots and equipment)
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> |
luzimu/github-issues | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: labels
list:
- name: id
dtype: int64
- name: node_id
dtype: string
- name: url
dtype: string
- name: name
dtype: string
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: assignees
list:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: milestone
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: labels_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: description
dtype: string
- name: creator
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: open_issues
dtype: int64
- name: closed_issues
dtype: int64
- name: state
dtype: string
- name: created_at
dtype: timestamp[s]
- name: updated_at
dtype: timestamp[s]
- name: due_on
dtype: 'null'
- name: closed_at
dtype: 'null'
- name: comments
sequence: string
- name: created_at
dtype: timestamp[s]
- name: updated_at
dtype: timestamp[s]
- name: closed_at
dtype: timestamp[s]
- name: author_association
dtype: string
- name: active_lock_reason
dtype: 'null'
- name: draft
dtype: bool
- name: pull_request
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: diff_url
dtype: string
- name: patch_url
dtype: string
- name: merged_at
dtype: timestamp[s]
- name: body
dtype: string
- name: reactions
struct:
- name: url
dtype: string
- name: total_count
dtype: int64
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: laugh
dtype: int64
- name: hooray
dtype: int64
- name: confused
dtype: int64
- name: heart
dtype: int64
- name: rocket
dtype: int64
- name: eyes
dtype: int64
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: 'null'
- name: state_reason
dtype: string
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 10814142
num_examples: 1000
download_size: 3097056
dataset_size: 10814142
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "github-issues"
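The schema above stores plain issues and pull requests together in one `train` split and flags the latter with the boolean `is_pull_request` field (the GitHub issues API returns both). A minimal sketch of partitioning rows on that flag; plain dicts stand in for dataset rows, and everything except the field name is illustrative:

```python
# Separate plain issues from pull requests using the `is_pull_request`
# flag from the schema above. Plain dicts stand in for dataset rows.

def split_issues_and_prs(rows):
    """Partition rows into (issues, pull_requests) by `is_pull_request`."""
    issues = [r for r in rows if not r["is_pull_request"]]
    prs = [r for r in rows if r["is_pull_request"]]
    return issues, prs

rows = [
    {"number": 1, "title": "Fix typo in README", "is_pull_request": True},
    {"number": 2, "title": "Loading fails on Windows", "is_pull_request": False},
]
issues, prs = split_issues_and_prs(rows)
print(len(issues), len(prs))  # 1 1
```

With the `datasets` library, the same predicate would go into `Dataset.filter`, e.g. `ds.filter(lambda r: not r["is_pull_request"])`.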
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
surabhiMV/qrcode_n | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
- name: bbox
sequence:
sequence:
sequence: float64
splits:
- name: train
num_bytes: 18271607.0
num_examples: 502
download_size: 17289874
dataset_size: 18271607.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "qrcode_n"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zhangshuoming/NumericBench-Eval-small-gpt3.5-zeroshot-result | ---
dataset_info:
features:
- name: c
dtype: string
- name: asm
dtype: string
splits:
- name: train
num_bytes: 390885
num_examples: 400
download_size: 123437
dataset_size: 390885
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "NumericBench-Eval-small-gpt3.5-zeroshot-result"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dliu1/legal-llama-raw-text | ---
license: apache-2.0
---
|
ayan1988/diffusion.7.control_net | ---
dataset_info:
features:
- name: image
dtype: image
- name: conditioning_image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 453988831.0
num_examples: 50000
download_size: 324957581
dataset_size: 453988831.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "diffusion.7.control_net"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heliosprime/twitter_dataset_1713005123 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 11224
num_examples: 24
download_size: 9091
dataset_size: 11224
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713005123"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dim/resh_edu_short_prompts | ---
dataset_info:
features:
- name: solution
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 12371576
num_examples: 2106
download_size: 5361614
dataset_size: 12371576
---
# Dataset Card for "resh_edu_short_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bigbio/bionlp_st_2011_id |
---
language:
- en
bigbio_language:
- English
license: other
multilinguality: monolingual
bigbio_license_shortname: GENIA_PROJECT_LICENSE
pretty_name: BioNLP 2011 ID
homepage: https://github.com/openbiocorpora/bionlp-st-2011-id
bigbio_pubmed: True
bigbio_public: True
bigbio_tasks:
- EVENT_EXTRACTION
- COREFERENCE_RESOLUTION
- NAMED_ENTITY_RECOGNITION
---
# Dataset Card for BioNLP 2011 ID
## Dataset Description
- **Homepage:** https://github.com/openbiocorpora/bionlp-st-2011-id
- **Pubmed:** True
- **Public:** True
- **Tasks:** EE, COREF, NER
This is the dataset of the Infectious Diseases (ID) task of the
BioNLP Shared Task 2011.
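Like the other BioNLP Shared Task corpora, the ID data is distributed in standoff format, where entity mentions are given as character-offset spans alongside the raw text. Below is a minimal sketch of converting such spans to token-level BIO tags; the whitespace tokenizer and the example sentence are illustrative, not this corpus's exact preprocessing:

```python
# Convert standoff entity annotations (character offsets, as in the
# task's .a1 files) into token-level BIO tags. A whitespace tokenizer
# is used here purely for illustration.

def char_spans_to_bio(text, entities):
    """entities: list of (start, end, label) character spans."""
    tokens, spans = [], []
    pos = 0
    for tok in text.split():
        start = text.index(tok, pos)  # locate token in the raw text
        end = start + len(tok)
        tokens.append(tok)
        spans.append((start, end))
        pos = end
    tags = []
    for start, end in spans:
        tag = "O"
        for e_start, e_end, label in entities:
            if start >= e_start and end <= e_end:
                tag = ("B-" if start == e_start else "I-") + label
                break
        tags.append(tag)
    return tokens, tags

tokens, tags = char_spans_to_bio(
    "Salmonella enterica infects macrophages",
    [(0, 19, "Organism")],
)
print(list(zip(tokens, tags)))
```

Tokens overlapping an entity span get `B-`/`I-` prefixes; all others are tagged `O`.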
## Citation Information
```
@inproceedings{pyysalo-etal-2011-overview,
title = "Overview of the Infectious Diseases ({ID}) task of {B}io{NLP} Shared Task 2011",
author = "Pyysalo, Sampo and
Ohta, Tomoko and
Rak, Rafal and
Sullivan, Dan and
Mao, Chunhong and
Wang, Chunxia and
Sobral, Bruno and
Tsujii, Jun{'}ichi and
Ananiadou, Sophia",
booktitle = "Proceedings of {B}io{NLP} Shared Task 2011 Workshop",
month = jun,
year = "2011",
address = "Portland, Oregon, USA",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/W11-1804",
pages = "26--35",
}
```
|
autoevaluate/autoeval-staging-eval-project-samsum-db063b78-12135617 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- samsum
eval_info:
task: summarization
model: pszemraj/long-t5-tglobal-base-16384-booksum-V11-big_patent-V2
metrics: []
dataset_name: samsum
dataset_config: samsum
dataset_split: test
col_mapping:
text: dialogue
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: pszemraj/long-t5-tglobal-base-16384-booksum-V11-big_patent-V2
* Dataset: samsum
* Config: samsum
* Split: test
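The column mapping above tells the evaluator which dataset columns play the roles of `text` and `target`. A minimal sketch of what applying such a mapping to one example amounts to (plain dicts; the real AutoTrain pipeline handles this internally):

```python
# Rename dataset columns to the evaluator's expected names, following
# the col_mapping above (text -> dialogue, target -> summary).

col_mapping = {"text": "dialogue", "target": "summary"}

def apply_col_mapping(example, mapping):
    """Return the example keyed by the evaluator's column names."""
    return {alias: example[source] for alias, source in mapping.items()}

example = {"dialogue": "A: Hi! B: Hello.", "summary": "A greets B."}
print(apply_col_mapping(example, col_mapping))
# {'text': 'A: Hi! B: Hello.', 'target': 'A greets B.'}
```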
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@pszemraj](https://huggingface.co/pszemraj) for evaluating this model. |
open-llm-leaderboard/details_CallComply__zephyr-7b-beta-128k | ---
pretty_name: Evaluation run of CallComply/zephyr-7b-beta-128k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CallComply/zephyr-7b-beta-128k](https://huggingface.co/CallComply/zephyr-7b-beta-128k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CallComply__zephyr-7b-beta-128k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-14T19:45:35.717294](https://huggingface.co/datasets/open-llm-leaderboard/details_CallComply__zephyr-7b-beta-128k/blob/main/results_2024-01-14T19-45-35.717294.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5337384150834084,\n\
\ \"acc_stderr\": 0.034377622578911936,\n \"acc_norm\": 0.5411488270607204,\n\
\ \"acc_norm_stderr\": 0.03515985681109475,\n \"mc1\": 0.30966952264381886,\n\
\ \"mc1_stderr\": 0.016185744355144915,\n \"mc2\": 0.4609603387456776,\n\
\ \"mc2_stderr\": 0.01568400425776764\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5435153583617748,\n \"acc_stderr\": 0.01455594976049644,\n\
\ \"acc_norm\": 0.5827645051194539,\n \"acc_norm_stderr\": 0.014409825518403084\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6016729735112527,\n\
\ \"acc_stderr\": 0.004885529674958333,\n \"acc_norm\": 0.8099980083648676,\n\
\ \"acc_norm_stderr\": 0.003915007231962104\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5592105263157895,\n \"acc_stderr\": 0.04040311062490436,\n\
\ \"acc_norm\": 0.5592105263157895,\n \"acc_norm_stderr\": 0.04040311062490436\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5735849056603773,\n \"acc_stderr\": 0.03043779434298305,\n\
\ \"acc_norm\": 0.5735849056603773,\n \"acc_norm_stderr\": 0.03043779434298305\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6458333333333334,\n\
\ \"acc_stderr\": 0.039994111357535424,\n \"acc_norm\": 0.6458333333333334,\n\
\ \"acc_norm_stderr\": 0.039994111357535424\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.0498887651569859,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.0498887651569859\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n\
\ \"acc_stderr\": 0.0372424959581773,\n \"acc_norm\": 0.6069364161849711,\n\
\ \"acc_norm_stderr\": 0.0372424959581773\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4340425531914894,\n \"acc_stderr\": 0.032400380867927465,\n\
\ \"acc_norm\": 0.4340425531914894,\n \"acc_norm_stderr\": 0.032400380867927465\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n\
\ \"acc_stderr\": 0.04559522141958217,\n \"acc_norm\": 0.37719298245614036,\n\
\ \"acc_norm_stderr\": 0.04559522141958217\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.43448275862068964,\n \"acc_stderr\": 0.041307408795554966,\n\
\ \"acc_norm\": 0.43448275862068964,\n \"acc_norm_stderr\": 0.041307408795554966\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30687830687830686,\n \"acc_stderr\": 0.02375292871211213,\n \"\
acc_norm\": 0.30687830687830686,\n \"acc_norm_stderr\": 0.02375292871211213\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6548387096774193,\n\
\ \"acc_stderr\": 0.02704574657353433,\n \"acc_norm\": 0.6548387096774193,\n\
\ \"acc_norm_stderr\": 0.02704574657353433\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.43349753694581283,\n \"acc_stderr\": 0.03486731727419872,\n\
\ \"acc_norm\": 0.43349753694581283,\n \"acc_norm_stderr\": 0.03486731727419872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5757575757575758,\n \"acc_stderr\": 0.03859268142070264,\n\
\ \"acc_norm\": 0.5757575757575758,\n \"acc_norm_stderr\": 0.03859268142070264\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6717171717171717,\n \"acc_stderr\": 0.03345678422756776,\n \"\
acc_norm\": 0.6717171717171717,\n \"acc_norm_stderr\": 0.03345678422756776\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.772020725388601,\n \"acc_stderr\": 0.030276909945178263,\n\
\ \"acc_norm\": 0.772020725388601,\n \"acc_norm_stderr\": 0.030276909945178263\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5743589743589743,\n \"acc_stderr\": 0.02506909438729653,\n \
\ \"acc_norm\": 0.5743589743589743,\n \"acc_norm_stderr\": 0.02506909438729653\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608456,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608456\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.542016806722689,\n \"acc_stderr\": 0.03236361111951941,\n \
\ \"acc_norm\": 0.542016806722689,\n \"acc_norm_stderr\": 0.03236361111951941\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7486238532110092,\n \"acc_stderr\": 0.018599206360287415,\n \"\
acc_norm\": 0.7486238532110092,\n \"acc_norm_stderr\": 0.018599206360287415\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5980392156862745,\n \"acc_stderr\": 0.034411900234824655,\n \"\
acc_norm\": 0.5980392156862745,\n \"acc_norm_stderr\": 0.034411900234824655\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6286919831223629,\n \"acc_stderr\": 0.0314506860074486,\n \
\ \"acc_norm\": 0.6286919831223629,\n \"acc_norm_stderr\": 0.0314506860074486\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.57847533632287,\n\
\ \"acc_stderr\": 0.03314190222110657,\n \"acc_norm\": 0.57847533632287,\n\
\ \"acc_norm_stderr\": 0.03314190222110657\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.549618320610687,\n \"acc_stderr\": 0.04363643698524779,\n\
\ \"acc_norm\": 0.549618320610687,\n \"acc_norm_stderr\": 0.04363643698524779\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6694214876033058,\n \"acc_stderr\": 0.04294340845212094,\n \"\
acc_norm\": 0.6694214876033058,\n \"acc_norm_stderr\": 0.04294340845212094\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6203703703703703,\n\
\ \"acc_stderr\": 0.04691521224077742,\n \"acc_norm\": 0.6203703703703703,\n\
\ \"acc_norm_stderr\": 0.04691521224077742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6441717791411042,\n \"acc_stderr\": 0.03761521380046734,\n\
\ \"acc_norm\": 0.6441717791411042,\n \"acc_norm_stderr\": 0.03761521380046734\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7863247863247863,\n\
\ \"acc_stderr\": 0.026853450377009154,\n \"acc_norm\": 0.7863247863247863,\n\
\ \"acc_norm_stderr\": 0.026853450377009154\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7266922094508301,\n\
\ \"acc_stderr\": 0.01593668106262856,\n \"acc_norm\": 0.7266922094508301,\n\
\ \"acc_norm_stderr\": 0.01593668106262856\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5924855491329479,\n \"acc_stderr\": 0.026454578146931494,\n\
\ \"acc_norm\": 0.5924855491329479,\n \"acc_norm_stderr\": 0.026454578146931494\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2636871508379888,\n\
\ \"acc_stderr\": 0.01473692638376197,\n \"acc_norm\": 0.2636871508379888,\n\
\ \"acc_norm_stderr\": 0.01473692638376197\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.028431095444176643,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.028431095444176643\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5884244372990354,\n\
\ \"acc_stderr\": 0.02795048149440127,\n \"acc_norm\": 0.5884244372990354,\n\
\ \"acc_norm_stderr\": 0.02795048149440127\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5617283950617284,\n \"acc_stderr\": 0.02760791408740047,\n\
\ \"acc_norm\": 0.5617283950617284,\n \"acc_norm_stderr\": 0.02760791408740047\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4219858156028369,\n \"acc_stderr\": 0.029462189233370593,\n \
\ \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.029462189233370593\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3820078226857888,\n\
\ \"acc_stderr\": 0.012409564470235562,\n \"acc_norm\": 0.3820078226857888,\n\
\ \"acc_norm_stderr\": 0.012409564470235562\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5551470588235294,\n \"acc_stderr\": 0.03018753206032938,\n\
\ \"acc_norm\": 0.5551470588235294,\n \"acc_norm_stderr\": 0.03018753206032938\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5506535947712419,\n \"acc_stderr\": 0.02012376652802727,\n \
\ \"acc_norm\": 0.5506535947712419,\n \"acc_norm_stderr\": 0.02012376652802727\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.04582004841505417,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.04582004841505417\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6163265306122448,\n \"acc_stderr\": 0.031130880396235926,\n\
\ \"acc_norm\": 0.6163265306122448,\n \"acc_norm_stderr\": 0.031130880396235926\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6119402985074627,\n\
\ \"acc_stderr\": 0.0344578996436275,\n \"acc_norm\": 0.6119402985074627,\n\
\ \"acc_norm_stderr\": 0.0344578996436275\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03377310252209205,\n\
\ \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03377310252209205\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30966952264381886,\n\
\ \"mc1_stderr\": 0.016185744355144915,\n \"mc2\": 0.4609603387456776,\n\
\ \"mc2_stderr\": 0.01568400425776764\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7474348855564326,\n \"acc_stderr\": 0.012211148449394105\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13040181956027294,\n \
\ \"acc_stderr\": 0.009275630324554092\n }\n}\n```"
repo_url: https://huggingface.co/CallComply/zephyr-7b-beta-128k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|arc:challenge|25_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|gsm8k|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hellaswag|10_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T19-45-35.717294.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-14T19-45-35.717294.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- '**/details_harness|winogrande|5_2024-01-14T19-45-35.717294.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-14T19-45-35.717294.parquet'
- config_name: results
data_files:
- split: 2024_01_14T19_45_35.717294
path:
- results_2024-01-14T19-45-35.717294.parquet
- split: latest
path:
- results_2024-01-14T19-45-35.717294.parquet
---
# Dataset Card for Evaluation run of CallComply/zephyr-7b-beta-128k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [CallComply/zephyr-7b-beta-128k](https://huggingface.co/CallComply/zephyr-7b-beta-128k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run. Each run appears as a specific split in each configuration, the split being named after the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CallComply__zephyr-7b-beta-128k",
"harness_winogrande_5",
split="train")
```
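Since each run's split is named after its timestamp, you can also pick the newest run programmatically without relying on the "latest" alias. A minimal sketch (the helper `latest_run_split` is hypothetical, not part of the `datasets` API), which works because timestamps of the form `2024_01_14T19_45_35.717294` sort chronologically as plain strings:

```python
def latest_run_split(split_names):
    """Return the most recent timestamp-named split.

    Timestamp split names like '2024_01_14T19_45_35.717294' are
    zero-padded, so lexicographic order matches chronological order
    and max() picks the newest run.
    """
    runs = [s for s in split_names if s != "latest"]
    return max(runs)

# For this dataset there is a single run, so the result is that run:
print(latest_run_split(["2024_01_14T19_45_35.717294", "latest"]))
```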
## Latest results
These are the [latest results from run 2024-01-14T19:45:35.717294](https://huggingface.co/datasets/open-llm-leaderboard/details_CallComply__zephyr-7b-beta-128k/blob/main/results_2024-01-14T19-45-35.717294.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5337384150834084,
"acc_stderr": 0.034377622578911936,
"acc_norm": 0.5411488270607204,
"acc_norm_stderr": 0.03515985681109475,
"mc1": 0.30966952264381886,
"mc1_stderr": 0.016185744355144915,
"mc2": 0.4609603387456776,
"mc2_stderr": 0.01568400425776764
},
"harness|arc:challenge|25": {
"acc": 0.5435153583617748,
"acc_stderr": 0.01455594976049644,
"acc_norm": 0.5827645051194539,
"acc_norm_stderr": 0.014409825518403084
},
"harness|hellaswag|10": {
"acc": 0.6016729735112527,
"acc_stderr": 0.004885529674958333,
"acc_norm": 0.8099980083648676,
"acc_norm_stderr": 0.003915007231962104
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5592105263157895,
"acc_stderr": 0.04040311062490436,
"acc_norm": 0.5592105263157895,
"acc_norm_stderr": 0.04040311062490436
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5735849056603773,
"acc_stderr": 0.03043779434298305,
"acc_norm": 0.5735849056603773,
"acc_norm_stderr": 0.03043779434298305
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6458333333333334,
"acc_stderr": 0.039994111357535424,
"acc_norm": 0.6458333333333334,
"acc_norm_stderr": 0.039994111357535424
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.0498887651569859,
"acc_norm": 0.44,
"acc_norm_stderr": 0.0498887651569859
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.0372424959581773,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.0372424959581773
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4340425531914894,
"acc_stderr": 0.032400380867927465,
"acc_norm": 0.4340425531914894,
"acc_norm_stderr": 0.032400380867927465
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.04559522141958217,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.04559522141958217
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.43448275862068964,
"acc_stderr": 0.041307408795554966,
"acc_norm": 0.43448275862068964,
"acc_norm_stderr": 0.041307408795554966
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30687830687830686,
"acc_stderr": 0.02375292871211213,
"acc_norm": 0.30687830687830686,
"acc_norm_stderr": 0.02375292871211213
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6548387096774193,
"acc_stderr": 0.02704574657353433,
"acc_norm": 0.6548387096774193,
"acc_norm_stderr": 0.02704574657353433
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43349753694581283,
"acc_stderr": 0.03486731727419872,
"acc_norm": 0.43349753694581283,
"acc_norm_stderr": 0.03486731727419872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5757575757575758,
"acc_stderr": 0.03859268142070264,
"acc_norm": 0.5757575757575758,
"acc_norm_stderr": 0.03859268142070264
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6717171717171717,
"acc_stderr": 0.03345678422756776,
"acc_norm": 0.6717171717171717,
"acc_norm_stderr": 0.03345678422756776
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.772020725388601,
"acc_stderr": 0.030276909945178263,
"acc_norm": 0.772020725388601,
"acc_norm_stderr": 0.030276909945178263
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5743589743589743,
"acc_stderr": 0.02506909438729653,
"acc_norm": 0.5743589743589743,
"acc_norm_stderr": 0.02506909438729653
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608456,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608456
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.542016806722689,
"acc_stderr": 0.03236361111951941,
"acc_norm": 0.542016806722689,
"acc_norm_stderr": 0.03236361111951941
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7486238532110092,
"acc_stderr": 0.018599206360287415,
"acc_norm": 0.7486238532110092,
"acc_norm_stderr": 0.018599206360287415
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5980392156862745,
"acc_stderr": 0.034411900234824655,
"acc_norm": 0.5980392156862745,
"acc_norm_stderr": 0.034411900234824655
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6286919831223629,
"acc_stderr": 0.0314506860074486,
"acc_norm": 0.6286919831223629,
"acc_norm_stderr": 0.0314506860074486
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.57847533632287,
"acc_stderr": 0.03314190222110657,
"acc_norm": 0.57847533632287,
"acc_norm_stderr": 0.03314190222110657
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.549618320610687,
"acc_stderr": 0.04363643698524779,
"acc_norm": 0.549618320610687,
"acc_norm_stderr": 0.04363643698524779
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6694214876033058,
"acc_stderr": 0.04294340845212094,
"acc_norm": 0.6694214876033058,
"acc_norm_stderr": 0.04294340845212094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.04691521224077742,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.04691521224077742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6441717791411042,
"acc_stderr": 0.03761521380046734,
"acc_norm": 0.6441717791411042,
"acc_norm_stderr": 0.03761521380046734
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7863247863247863,
"acc_stderr": 0.026853450377009154,
"acc_norm": 0.7863247863247863,
"acc_norm_stderr": 0.026853450377009154
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7266922094508301,
"acc_stderr": 0.01593668106262856,
"acc_norm": 0.7266922094508301,
"acc_norm_stderr": 0.01593668106262856
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5924855491329479,
"acc_stderr": 0.026454578146931494,
"acc_norm": 0.5924855491329479,
"acc_norm_stderr": 0.026454578146931494
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2636871508379888,
"acc_stderr": 0.01473692638376197,
"acc_norm": 0.2636871508379888,
"acc_norm_stderr": 0.01473692638376197
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.028431095444176643,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.028431095444176643
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5884244372990354,
"acc_stderr": 0.02795048149440127,
"acc_norm": 0.5884244372990354,
"acc_norm_stderr": 0.02795048149440127
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5617283950617284,
"acc_stderr": 0.02760791408740047,
"acc_norm": 0.5617283950617284,
"acc_norm_stderr": 0.02760791408740047
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4219858156028369,
"acc_stderr": 0.029462189233370593,
"acc_norm": 0.4219858156028369,
"acc_norm_stderr": 0.029462189233370593
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3820078226857888,
"acc_stderr": 0.012409564470235562,
"acc_norm": 0.3820078226857888,
"acc_norm_stderr": 0.012409564470235562
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5551470588235294,
"acc_stderr": 0.03018753206032938,
"acc_norm": 0.5551470588235294,
"acc_norm_stderr": 0.03018753206032938
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5506535947712419,
"acc_stderr": 0.02012376652802727,
"acc_norm": 0.5506535947712419,
"acc_norm_stderr": 0.02012376652802727
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.04582004841505417,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.04582004841505417
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6163265306122448,
"acc_stderr": 0.031130880396235926,
"acc_norm": 0.6163265306122448,
"acc_norm_stderr": 0.031130880396235926
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6119402985074627,
"acc_stderr": 0.0344578996436275,
"acc_norm": 0.6119402985074627,
"acc_norm_stderr": 0.0344578996436275
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4457831325301205,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.4457831325301205,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.03377310252209205,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.03377310252209205
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30966952264381886,
"mc1_stderr": 0.016185744355144915,
"mc2": 0.4609603387456776,
"mc2_stderr": 0.01568400425776764
},
"harness|winogrande|5": {
"acc": 0.7474348855564326,
"acc_stderr": 0.012211148449394105
},
"harness|gsm8k|5": {
"acc": 0.13040181956027294,
"acc_stderr": 0.009275630324554092
}
}
```
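Results files like the one above can be aggregated programmatically. A minimal sketch of averaging the `acc` metric across harness tasks (the dict below is a small excerpt mirroring the JSON layout, not the full results table):

```python
# Sketch: average the "acc" metric across lm-eval-harness tasks.
# The nested layout mirrors the results JSON above; only a few
# tasks are included here for illustration.
results = {
    "harness|hendrycksTest-high_school_chemistry|5": {"acc": 0.43349753694581283},
    "harness|hendrycksTest-high_school_computer_science|5": {"acc": 0.5},
    "harness|winogrande|5": {"acc": 0.7474348855564326},
}

def mean_acc(results: dict) -> float:
    """Mean of the 'acc' metric over every task that reports one."""
    accs = [metrics["acc"] for metrics in results.values() if "acc" in metrics]
    return sum(accs) / len(accs)

print(round(mean_acc(results), 4))
```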
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
jondurbin/airoboros-gpt4 | ---
license: cc-by-nc-4.0
---
The data was generated by GPT-4 and is therefore subject to the OpenAI ToS. The tool used to generate the data, [airoboros](https://github.com/jondurbin/airoboros), is Apache-2.0 licensed.
Specific areas of focus for this training data:
* trivia
* math
* nonsensical math
* coding
* closed context question answering
* closed context question answering, with multiple contexts to choose from as confounding factors
* writing
* multiple choice
### Usage and License Notices
All airoboros models and datasets are intended and licensed for research use only. I've used the 'cc-by-nc-4.0' license, but really it is subject to a custom/special license because:
- the base model is LLaMa, which has its own special research license
- the dataset(s) were generated with OpenAI (gpt-4 and/or gpt-3.5-turbo), which has a clause saying the data can't be used to create models that compete with OpenAI
So, to reiterate: this model (and datasets) cannot be used commercially. |
Felladrin/ChatML-SlimOrca-Dedup | ---
language:
- en
license: mit
size_categories:
- 100K<n<1M
task_categories:
- text-classification
- question-answering
- text-generation
pretty_name: SlimOrca Dedup
tags:
- code
- art
- music
- legal
- finance
- biology
- chemistry
---
[Open-Orca/SlimOrca-Dedup](https://huggingface.co/datasets/Open-Orca/SlimOrca-Dedup) in ChatML format, ready to use in [HuggingFace TRL's SFT Trainer](https://huggingface.co/docs/trl/main/en/sft_trainer).
Python code used for conversion:
```python
from datasets import load_dataset
from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("Felladrin/Llama-160M-Chat-v1")
dataset = load_dataset("Open-Orca/SlimOrca-Dedup", split="train")
def format(columns):
    messages = []

    conversations = columns["conversations"]

    for i in range(len(conversations)):
        message = conversations[i]

        content = message["value"]
        role = message["from"]

        if role == "human":
            role = "user"
        elif role == "gpt":
            role = "assistant"

        if role and content:
            messages.append(
                {
                    "role": role.strip(),
                    "content": content.strip(),
                }
            )

    return { "text": tokenizer.apply_chat_template(messages, tokenize=False) }
dataset.map(format).select_columns(['text']).to_parquet("train.parquet")
```
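For reference, ChatML wraps each message in `<|im_start|>`/`<|im_end|>` markers. A minimal sketch of the layout that `apply_chat_template` produces under a standard ChatML template (this hand-rolled version only illustrates the target structure; the real formatting comes from the tokenizer's own template):

```python
# Sketch: render a message list in the standard ChatML layout.
# This re-implements the format by hand purely for illustration.
def to_chatml(messages: list[dict]) -> str:
    return "".join(
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n" for m in messages
    )

example = [
    {"role": "user", "content": "What is the capital of France?"},
    {"role": "assistant", "content": "Paris."},
]
print(to_chatml(example))
```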
|
ESGBERT/environmental_2k | ---
license: apache-2.0
---
|
Jarmac/llama2_di_dataset_train_prompt | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 162127317
num_examples: 68785
download_size: 77410082
dataset_size: 162127317
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
somosnlp/recetasdelaabuela_genstruct_it | ---
language:
- es
license: apache-2.0
size_categories:
- 10K<n<100K
task_categories:
- question-answering
dataset_info:
features:
- name: title
dtype: string
- name: content
dtype: string
- name: messages
sequence: 'null'
- name: generation_model
sequence: string
- name: generation_prompt
sequence: string
- name: raw_generation_responses
sequence: string
- name: conversation
sequence:
sequence: string
splits:
- name: train
num_bytes: 103228164
num_examples: 20085
download_size: 49502853
dataset_size: 103228164
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Description
Dataset created for the #Somos600M hackathon with the goal of training a model that can recommend recipes from Spanish-speaking countries.
This dataset consists of question-answer pairs and was built from context using Genstruct-7B and distilabel.
It was derived from the raw dataset [somosnlp/RecetasDeLaAbuela](https://huggingface.co/datasets/somosnlp/RecetasDeLaAbuela), produced by the recetasdelaabuela team via web scraping.
## Dataset Origin
The dataset was obtained by web scraping the following pages:
- https://www.elmueble.com/
- https://www.yanuq.com/
- https://www.directoalpaladar.com/
- https://www.recetasgratis.net/
- https://cookpad.com/pe/
## Notebook Used
Built with this [colab](https://colab.research.google.com/drive/1-7OY5ORmOw0Uy_uazXDDqjWWkwCKvWbL?usp=sharing).
## Contact
If you find an error or have a recommendation, please let me know! The goal is for the dataset to keep improving over time; you can find me on Hugging Face as @sbenel, or contact a member of the hackathon team on Discord. |
pbaoo2705/processed_dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3544789
num_examples: 5000
- name: test
num_bytes: 708063
num_examples: 1000
download_size: 2342034
dataset_size: 4252852
---
# Dataset Card for "processed_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SINAI/Spanish-QC | ---
license: cc-by-nc-sa-4.0
language:
- es
tags:
- Answer Search classification
pretty_name: Spanish-QC
---
### Dataset Description
**Paper**: [BRUJA: Question Classification for Spanish. Using Machine Translation and an English Classifier.](https://aclanthology.org/W06-1906.pdf)
**Point of Contact**: magc@ujaen.es
This resource contains 6,305 questions in Spanish labeled for Answer Search classification, following the taxonomy defined in the article "X. Li and D. Roth. Learning Question Classifiers", which has the following general and detailed categories:
- ABBR: abbreviation, expansion
- DESC: definition, description, mode, motif
- ENTY: animal, body, color, creation, currency, disease/medical, event, food, instrument, language, letter, other, plant, product, religion, sport, substance, symbol, technique, term, vehicle, word
- HUM: description, group, individual, title
- LOC: city, country, mountain, other, state
- NUM: code, count, date, distance, money, order, other, percentage, period, speed, temperature, size, weight
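Labels of this shape are often stored as `COARSE:fine` strings; a small sketch of splitting them back into the two levels (the `:` separator is an assumed storage convention for illustration, not taken from the data):

```python
# Sketch: split a "COARSE:fine" question-class label into its two levels.
# The ":" separator is an assumption, not confirmed by the dataset files.
def split_label(label: str) -> tuple[str, str]:
    coarse, _, fine = label.partition(":")
    return coarse, fine

print(split_label("LOC:city"))
print(split_label("NUM:date"))
```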
Starting from a set of labeled English questions, this resource was generated with Spanish questions that were labeled and reviewed by three people.
### Acknowledgments
This work has been supported by the Spanish Government (MCYT) with grant TIC2003-07158-C04-04.
### Licensing Information
Spanish-QC is released under the [Apache-2.0 License](http://www.apache.org/licenses/LICENSE-2.0).
### Citation Information
```bibtex
@inproceedings{a-garcia-cumbreras-etal-2006-bruja,
title = "{BRUJA}: Question Classification for {S}panish. Using Machine Translationand an {E}nglish Classifier",
author = "Garc{\'\i}a Cumbreras, Miguel {\'A}. and
Ure{\~n}a L{\'o}pez, L. Alfonso and
Mart{\'\i}nez Santiago, Fernando",
booktitle = "Proceedings of the Workshop on Multilingual Question Answering - {MLQA} {`}06",
year = "2006",
url = "https://aclanthology.org/W06-1906",
}
``` |
qbao775/PARARULE-Plus-Depth-5 | ---
license: mit
task_categories:
- text-classification
- question-answering
language:
- en
tags:
- Reasoning
- Multi-Step-Deductive-Reasoning
- Logical-Reasoning
size_categories:
- 100K<n<1M
---
# PARARULE-Plus-Depth-5
This branch contains the Depth=5 split of PARARULE-Plus. PARARULE-Plus is a deep multi-step reasoning dataset over natural language, and can be seen as an improvement on PARARULE (Peter Clark et al., 2020). Both PARARULE and PARARULE-Plus follow the closed-world assumption and negation as failure. The motivation is to generate deeper PARARULE training samples: we add more training samples for cases where the depth is greater than or equal to two, to explore whether Transformers have reasoning ability. PARARULE-Plus combines two types of entities, animals and people, with corresponding relationships and attributes. From depth 2 to depth 5, there are around 100,000 samples at each depth, and nearly 400,000 samples in total.
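A toy illustration of the closed-world assumption and negation as failure (the facts below are made up for illustration, not drawn from the actual dataset):

```python
# Sketch: closed-world reasoning with negation as failure.
# Under the closed-world assumption, any fact not derivable is false,
# so "not X" succeeds exactly when X cannot be proven.
facts = {("kitten", "is", "small"), ("kitten", "is", "cute")}

def holds(triple: tuple) -> bool:
    # Closed-world assumption: unknown means false.
    return triple in facts

def naf(triple: tuple) -> bool:
    # Negation as failure: true iff the fact cannot be proven.
    return not holds(triple)

# "The kitten is not big" succeeds because bigness is not derivable.
print(holds(("kitten", "is", "small")), naf(("kitten", "is", "big")))
```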
Here are the original links for PARARULE-Plus, including the paper, project and data.
Paper: https://www.cs.ox.ac.uk/isg/conferences/tmp-proceedings/NeSy2022/paper15.pdf
Project: https://github.com/Strong-AI-Lab/Multi-Step-Deductive-Reasoning-Over-Natural-Language
Data: https://github.com/Strong-AI-Lab/PARARULE-Plus
PARARULE-Plus has been collected and merged by [LogiTorch.ai](https://www.logitorch.ai/), [ReasoningNLP](https://github.com/FreedomIntelligence/ReasoningNLP), [Prompt4ReasoningPapers](https://github.com/zjunlp/Prompt4ReasoningPapers) and [OpenAI/Evals](https://github.com/openai/evals/pull/651).
In this huggingface version, we pre-processed the dataset and use `1` to represent `true` and `0` to represent `false` to better help user train model.
## How to load the dataset?
```
from datasets import load_dataset
dataset = load_dataset("qbao775/PARARULE-Plus-Depth-5")
```
## How to train a model using the dataset?
We provide an [example](https://github.com/Strong-AI-Lab/PARARULE-Plus/blob/main/README.md#an-example-script-to-load-pararule-plus-and-fine-tune-bert) showing how you can `git clone` the project and fine-tune a model on the dataset locally.
## Citation
```
@inproceedings{bao2022multi,
title={Multi-Step Deductive Reasoning Over Natural Language: An Empirical Study on Out-of-Distribution Generalisation},
author={Qiming Bao and Alex Yuxuan Peng and Tim Hartill and Neset Tan and Zhenyun Deng and Michael Witbrock and Jiamou Liu},
year={2022},
publisher={The 2nd International Joint Conference on Learning and Reasoning and 16th International Workshop on Neural-Symbolic Learning and Reasoning (IJCLR-NeSy 2022)}
}
``` |
davanstrien/haiku_dop |
sonpv1/test | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 117000525.0
num_examples: 444
download_size: 116736869
dataset_size: 117000525.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
echarlaix/vqa-lxmert | ---
license: apache-2.0
---
|
karan4d/instruct_machiavellian_textbooks | ---
license: apache-2.0
---
Credits: shoutout to @vikp for his textbook_quality GH repo, which this was created with.
Dataset info: a bunch of bad boy data for Machiavellian LLMs |
Ingrid0693/bert_train_val | ---
license: mit
dataset_info:
features:
- name: id
dtype: int64
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
- name: is_impossible
dtype: bool
splits:
- name: train
num_bytes: 65160295
num_examples: 2019
- name: validation
num_bytes: 39069457
num_examples: 1212
download_size: 5496786
dataset_size: 104229752
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
CyberHarem/konngara_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of konngara (Touhou)
This is the dataset of konngara (Touhou), containing 89 images and their tags.
The core tags of this character are `horns, single_horn, red_eyes, black_hair, ponytail, long_hair, ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 89 | 77.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/konngara_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 89 | 54.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/konngara_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 171 | 101.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/konngara_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 89 | 72.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/konngara_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 171 | 127.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/konngara_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/konngara_touhou',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, japanese_clothes, katana, solo, profile, sheath |
| 1 | 6 |  |  |  |  |  | 1girl, sakazuki, solo, wide_sleeves, katana, kimono, hair_bow, looking_at_viewer, holding |
| 2 | 12 |  |  |  |  |  | 1girl, holding_sword, solo, katana, long_sleeves, looking_at_viewer, wide_sleeves, closed_mouth, hair_ribbon, bangs, holding_cup, sakazuki, red_kimono, red_ribbon, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | japanese_clothes | katana | solo | profile | sheath | sakazuki | wide_sleeves | kimono | hair_bow | looking_at_viewer | holding | holding_sword | long_sleeves | closed_mouth | hair_ribbon | bangs | holding_cup | red_kimono | red_ribbon | simple_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------------|:---------|:-------|:----------|:---------|:-----------|:---------------|:---------|:-----------|:--------------------|:----------|:----------------|:---------------|:---------------|:--------------|:--------|:--------------|:-------------|:-------------|:--------------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | | X | X | | | X | X | X | X | X | X | | | | | | | | | |
| 2 | 12 |  |  |  |  |  | X | | X | X | | | X | X | | | X | | X | X | X | X | X | X | X | X | X |
|
SEACrowd/kamus_alay | ---
license: unknown
tags:
- morphological-inflection
language:
- ind
---
# kamus_alay
Kamus Alay provides a lexicon for text normalization of Indonesian colloquial words.
It contains 3,592 unique colloquial words (also known as "bahasa alay"), each manually annotated
with its normalized form. The lexicon was built from Instagram comments provided by Septiandri & Wibisono (2017).
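Normalization with such a lexicon is essentially a dictionary lookup over tokens; a minimal sketch (the two entries below are illustrative, not taken from the actual lexicon):

```python
# Sketch: normalize colloquial Indonesian tokens via a lexicon lookup.
# These entries are illustrative only; the real lexicon has 3,592 pairs.
lexicon = {"gk": "tidak", "sy": "saya"}

def normalize(sentence: str) -> str:
    # Replace each token with its normalized form when the lexicon has one.
    return " ".join(lexicon.get(tok, tok) for tok in sentence.split())

print(normalize("sy gk tahu"))
```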
## Dataset Usage
Run `pip install nusacrowd` before loading the dataset through HuggingFace's `load_dataset`.
## Citation
```
@INPROCEEDINGS{8629151,
author={Aliyah Salsabila, Nikmatun and Ardhito Winatmoko, Yosef and Akbar Septiandri, Ali and Jamal, Ade},
booktitle={2018 International Conference on Asian Language Processing (IALP)},
title={Colloquial Indonesian Lexicon},
year={2018},
volume={},
number={},
pages={226-229},
doi={10.1109/IALP.2018.8629151}}
```
## License
Unknown
## Homepage
[https://ieeexplore.ieee.org/abstract/document/8629151](https://ieeexplore.ieee.org/abstract/document/8629151)
### NusaCatalogue
For easy indexing and metadata: [https://indonlp.github.io/nusa-catalogue](https://indonlp.github.io/nusa-catalogue) |
jakartaresearch/id-review-gen | ---
annotations_creators:
- machine-generated
language_creators:
- found
language:
- id
license:
- apache-2.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-generation
task_ids:
- language-modeling
pretty_name: id-review-gen
tags:
- product review
- review
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 13161717
num_examples: 105324
download_size: 13161717
dataset_size: 13161717
---
|
wza/FinVis | ---
license: apache-2.0
---
Dataset for the paper FinVis-GPT: A Multimodal Large Language Model for Financial Chart Analysis (https://github.com/wwwadx/FinVis-GPT).
The .zip file contains the images. |
zelihami/nlpfinal | ---
dataset_info:
features:
- name: metin
dtype: string
- name: review_length
dtype: int64
splits:
- name: train
num_bytes: 38685.92380952381
num_examples: 94
- name: validation
num_bytes: 4527.07619047619
num_examples: 11
download_size: 36066
dataset_size: 43213.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
deepapaikar/Katzbot_QA_pairs_2col | ---
license: apache-2.0
---
|
SumanMondal/bengali_qa | ---
license: apache-2.0
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
splits:
- name: train
num_bytes: 278640306
num_examples: 124886
download_size: 28941728
dataset_size: 278640306
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hanruijiang/civitai-stable-diffusion-2.5m | ---
license: apache-2.0
task_categories:
- text-generation
- text-to-image
language:
- en
tags:
- art
size_categories:
- 1M<n<10M
---
Inspired by thefcraft/civitai-stable-diffusion-337k.
Collected using the Civitai API to gather all prompts. |
cakiki/test_images | ---
dataset_info:
features:
- name: adjective
dtype: string
- name: profession
dtype: string
- name: 'no'
dtype: int32
- name: image_path
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 2362623.085768742
num_examples: 3
download_size: 1727393
dataset_size: 2362623.085768742
---
# Dataset Card for "test_images"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NobodyExistsOnTheInternet/sharegptPIPPA | ---
license: mit
---
|
FINNUMBER/FINCH_TRAIN_400 | ---
dataset_info:
features:
- name: task
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 19014039
num_examples: 4800
download_size: 10040647
dataset_size: 19014039
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Ceyase/audio-diffusion-touhou | ---
license: gpl-3.0
---
|
lrana/MMLU_ita | ---
task_categories:
- zero-shot-classification
- text-classification
- question-answering
- text-generation
language:
- it
tags:
- chemistry
- biology
- legal
- finance
- music
- code
- medical
pretty_name: MMLU Italian Version
size_categories:
- 1K<n<10K
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
nrtf/exp-gan | ---
license: cc-by-nc-sa-4.0
---
|
JLB-JLB/seizure_eeg_greyscale_224x224_6secWindow_adjusted | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: eval
path: data/eval-*
- split: dev
path: data/dev-*
dataset_info:
features:
- name: image
dtype: image
- name: epoch
dtype: int64
- name: label
dtype:
class_label:
names:
'0': seiz
'1': bckg
splits:
- name: train
num_bytes: 2785881214.663499
num_examples: 93962
- name: eval
num_bytes: 446792667.3100732
num_examples: 14910
- name: dev
num_bytes: 11628715785.0
num_examples: 390190
download_size: 7728884651
dataset_size: 14861389666.973572
---
# Dataset Card for "seizure_eeg_greyscale_224x224_6secWindow_adjusted"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ogbrandt/pjf_llama_instruction_prep | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 274039
num_examples: 536
download_size: 140067
dataset_size: 274039
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/miyamoto_frederica_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of miyamoto_frederica/宮本フレデリカ/미야모토프레데리카 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of miyamoto_frederica/宮本フレデリカ/미야모토프레데리카 (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags.
The core tags of this character are `blonde_hair, short_hair, green_eyes, bangs, breasts, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 754.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/miyamoto_frederica_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 419.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/miyamoto_frederica_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1169 | 876.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/miyamoto_frederica_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 659.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/miyamoto_frederica_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1169 | 1.26 GiB | [Download](https://huggingface.co/datasets/CyberHarem/miyamoto_frederica_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/miyamoto_frederica_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
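As a hedged sketch (not part of the official tooling), the clusters below can be mined from a loaded `LocalSource` by checking tag membership. The exact structure of `item.meta['tags']` (list of tags vs. tag-to-confidence mapping) is an assumption, so the helper relies only on membership tests, which both forms support:

```python
def matches_cluster(tags, required_tags):
    """True if every required tag is present in the item's tag collection.

    Works whether `tags` is a list of tag strings or a mapping of
    tag -> confidence, since both support the `in` operator.
    """
    return all(tag in tags for tag in required_tags)

# Example: a few tags from cluster 6 (maid outfit) in the tables below.
maid_cluster = ["1girl", "apron", "maid_headdress", "cleavage", "thighhighs"]

# Usage with waifuc (assumes the extraction snippet above has been run):
# from waifuc.source import LocalSource
# for item in LocalSource('dataset_dir'):
#     if matches_cluster(item.meta['tags'], maid_cluster):
#         print(item.meta['filename'])

# Standalone demonstration with plain tag collections:
print(matches_cluster(
    ["1girl", "apron", "maid_headdress", "cleavage", "thighhighs", "smile"],
    maid_cluster))  # True
print(matches_cluster({"1girl": 0.99, "apron": 0.9}, maid_cluster))  # False
```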
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 22 |  |  |  |  |  | 1girl, looking_at_viewer, solo, blush, simple_background, smile, bare_shoulders, collarbone, upper_body, white_background, off_shoulder, sweater, :3, closed_mouth |
| 1 | 5 |  |  |  |  |  | 1girl, blush, bridal_veil, collarbone, earrings, looking_at_viewer, smile, solo, wedding_dress, white_dress, braid, bride, cleavage, pearl_necklace, upper_body, asymmetrical_hair, bouquet, closed_mouth, tiara, white_background, white_rose, bare_shoulders, floral_print, hair_between_eyes, open_mouth, see-through |
| 2 | 6 |  |  |  |  |  | black_gloves, blush, brooch, butterfly_hair_ornament, long_sleeves, looking_at_viewer, 1girl, ascot, braid, corset, frills, smile, solo, bow, jacket, lace_gloves, parted_lips, ribbon, sitting, thigh_strap, belt, heart_hair_ornament, lace_trim, shiny_hair, shirt, short_shorts |
| 3 | 15 |  |  |  |  |  | 1girl, black_gloves, hat, looking_at_viewer, solo, smile, sleeveless, striped, blush, dress, skirt, corset, pink_headwear, heart_hair_ornament, open_mouth, black_necktie, floral_print, frills, garter_straps, thighhighs |
| 4 | 10 |  |  |  |  |  | 1girl, black_gloves, head_wings, looking_at_viewer, solo, bare_shoulders, frills, maid_headdress, apron, arm_garter, center_opening, smile, blush, lace_trim, black_ribbon, bow, cleavage_cutout, lace_gloves, large_breasts, open_mouth, parted_lips, pink_wings, ribbon_trim, simple_background, sleeveless_dress, upper_body, white_background, chocolate, cross-laced_clothes, demon_wings, hair_ribbon, heart_hair_ornament |
| 5 | 6 |  |  |  |  |  | 1girl, heart, solo, cleavage, detached_collar, frills, hair_bow, looking_at_viewer, maid_headdress, smile, wrist_cuffs, apron, blush, braid, detached_sleeves, pink_bow, neck_ribbon, one_eye_closed, simple_background |
| 6 | 5 |  |  |  |  |  | 1girl, apron, maid_headdress, smile, solo, cleavage, thighhighs, tongue_out, blush, one_eye_closed, bow, cupcake, garter_straps, looking_at_viewer |
| 7 | 6 |  |  |  |  |  | 1girl, bare_shoulders, blush, braided_bangs, earrings, looking_at_viewer, sleeveless_dress, solo, feathers, frills, plaid_dress, smile, beret, black_headwear, black_dress, bow, closed_mouth, hair_ornament |
| 8 | 6 |  |  |  |  |  | 1girl, beret, bow, bracelet, earrings, looking_at_viewer, nail_polish, sleeveless_dress, smile, solo, armlet, bare_shoulders, black_headwear, blush, braid, grey_dress, fishnet_pantyhose, high_heels, pink_nails, plaid_dress, arm_up, armpits, frills, standing, wrist_cuffs |
| 9 | 7 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, smile, solo, necklace, open_mouth, pink_dress, bare_shoulders, blush, rose, collarbone, frills, one_eye_closed, pink_flower, strapless_dress, white_gloves, ;d, elbow_gloves, feathers, hat_flower, petals |
| 10 | 5 |  |  |  |  |  | 1girl, earrings, hair_flower, looking_at_viewer, blush, bracelet, necklace, cleavage, nail_polish, red_dress, rose, smile, upper_body, coat, collarbone, heart, holding, one_eye_closed, pink_nails, solo_focus |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | blush | simple_background | smile | bare_shoulders | collarbone | upper_body | white_background | off_shoulder | sweater | :3 | closed_mouth | bridal_veil | earrings | wedding_dress | white_dress | braid | bride | cleavage | pearl_necklace | asymmetrical_hair | bouquet | tiara | white_rose | floral_print | hair_between_eyes | open_mouth | see-through | black_gloves | brooch | butterfly_hair_ornament | long_sleeves | ascot | corset | frills | bow | jacket | lace_gloves | parted_lips | ribbon | sitting | thigh_strap | belt | heart_hair_ornament | lace_trim | shiny_hair | shirt | short_shorts | hat | sleeveless | striped | dress | skirt | pink_headwear | black_necktie | garter_straps | thighhighs | head_wings | maid_headdress | apron | arm_garter | center_opening | black_ribbon | cleavage_cutout | large_breasts | pink_wings | ribbon_trim | sleeveless_dress | chocolate | cross-laced_clothes | demon_wings | hair_ribbon | heart | detached_collar | hair_bow | wrist_cuffs | detached_sleeves | pink_bow | neck_ribbon | one_eye_closed | tongue_out | cupcake | braided_bangs | feathers | plaid_dress | beret | black_headwear | black_dress | hair_ornament | bracelet | nail_polish | armlet | grey_dress | fishnet_pantyhose | high_heels | pink_nails | arm_up | armpits | standing | necklace | pink_dress | rose | pink_flower | strapless_dress | white_gloves | ;d | elbow_gloves | hat_flower | petals | hair_flower | red_dress | coat | holding | solo_focus |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------------------|:-------|:--------|:--------------------|:--------|:-----------------|:-------------|:-------------|:-------------------|:---------------|:----------|:-----|:---------------|:--------------|:-----------|:----------------|:--------------|:--------|:--------|:-----------|:-----------------|:--------------------|:----------|:--------|:-------------|:---------------|:--------------------|:-------------|:--------------|:---------------|:---------|:--------------------------|:---------------|:--------|:---------|:---------|:------|:---------|:--------------|:--------------|:---------|:----------|:--------------|:-------|:----------------------|:------------|:-------------|:--------|:---------------|:------|:-------------|:----------|:--------|:--------|:----------------|:----------------|:----------------|:-------------|:-------------|:-----------------|:--------|:-------------|:-----------------|:---------------|:------------------|:----------------|:-------------|:--------------|:-------------------|:------------|:----------------------|:--------------|:--------------|:--------|:------------------|:-----------|:--------------|:-------------------|:-----------|:--------------|:-----------------|:-------------|:----------|:----------------|:-----------|:--------------|:--------|:-----------------|:--------------|:----------------|:-----------|:--------------|:---------|:-------------|:--------------------|:-------------|:-------------|:---------|:----------|:-----------|:-----------|:-------------|:-------|:--------------|:------------------|:---------------|:-----|:---------------|:-------------|:---------|:--------------|:------------|:-------|:----------|:-------------|
| 0 | 22 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | | X | X | X | X | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | X | | X | | | | | | | | | | | | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 15 |  |  |  |  |  | X | X | X | X | | X | | | | | | | | | | | | | | | | | | | | | X | | X | | X | | | | | X | X | | | | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | | X | X | | | | | | | | | | | | | | | | | | | X | | X | | | | | | X | X | | X | X | | | | | X | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | X | X | X | X | X | | | | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | X | X | X | | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | X | | X | X | | | | | | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 6 |  |  |  |  |  | X | X | X | X | | X | X | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 6 |  |  |  |  |  | X | X | X | X | | X | X | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | X | | | | | | | | | X | X | X | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 9 | 7 |  |  |  |  |  | X | X | X | X | | X | X | X | | | | | | | | | | | | | X | | | | | | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | |
| 10 | 5 |  |  |  |  |  | X | X | | X | | X | | X | X | | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | X | | | | | | | | | | X | X | | | | | X | | | | X | | X | | | | | | | | X | X | X | X | X |
|
cmammides/BIOMON | ---
license: cc
---
|
3ee/regularization-landscape | ---
license: mit
tags:
- stable-diffusion
- regularization-images
- text-to-image
- image-to-image
- dreambooth
- class-instance
- preservation-loss-training
---
# Landscape Regularization Images
A collection of regularization & class-instance landscape datasets for Stable Diffusion 1.5, intended for DreamBooth prior-preservation-loss training. |
thibaud-perrin/hibo-function-calling-v1 | ---
language:
- en
license: mit
task_categories:
- text-generation
pretty_name: Hibo Function Calling V1
tags:
- function-calling
- fine-tuning
- text-generation
datasets:
- gathnex/Gath_baize
- glaiveai/glaive-function-calling-v2
dataset_info:
features:
- name: dataset_origin
dtype: string
- name: system
list:
- name: content
dtype: string
- name: role
dtype: string
- name: chat
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 487733895.0
num_examples: 323271
download_size: 211122522
dataset_size: 487733895.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# hibo-function-calling-v1
<div align="center">
<img src="./img/banner.webp" width="100%" />
</div>
[](https://github.com/thibaud-perrin/hibo-mistral-7b-fc)
## 📖 Dataset Description
This dataset, named "hibo-function-calling-v1", is designed to facilitate the fine-tuning of Large Language Models (LLMs) for function calling tasks. It comprises a single 'train' split containing 323,271 data points across three columns: 'dataset_origin', 'system', and 'chat'.
The dataset is a result of merging two distinct sources: `gathnex/Gath_baize` and `glaiveai/glaive-function-calling-v2`, with an aim to provide a comprehensive foundation for training models that can understand and generate function calls in a conversational context. The 'chat_sample' column from `gathnex/Gath_baize` has been split into two separate columns ('chat' and 'system') to better align with the structure conducive to LLM training. Additionally, the 'dataset_origin' column has been introduced (inside `gathnex/Gath_baize`) to track the source of each data entry, enhancing traceability and dataset integrity.
## 🎯 Dataset Goal
The primary objective of this dataset is to empower researchers and developers in the field of AI and machine learning to fine-tune LLMs for improved performance in function calling scenarios. By providing a rich set of conversational exchanges coupled with system interactions, the dataset aims to facilitate the development of models capable of understanding nuanced instructions and executing function calls within a conversational framework.
## 📈 Dataset Structure
### Data Fields
- `dataset_origin`: Indicates the source of the data point, with values `stackoverflow`, `alpaca`, `quora`, `medical`, or `glaiveai/glaive-function-calling-v2`.
- `system`: Contains the AI assistant system instruction.
- `chat`: Contains the AI assistant and user messages from the conversational exchanges.
### Data Splits
The dataset contains only one split:
- `train`: 323,271 data points.
## 🔄 Dataset Creation
### Source Data
#### Initial Data Collection and Normalization
The dataset was created by merging two datasets: `gathnex/Gath_baize` and `glaiveai/glaive-function-calling-v2`. The 'chat_sample' column from `gathnex/Gath_baize` was meticulously split into 'chat' and 'system' columns to maintain consistency with the dataset structure and objectives. The 'dataset_origin' column was added to ensure transparency and traceability regarding the data's source.
#### Who are the source language producers?
The source data comes from the conversational interactions collected and curated within the `gathnex/Gath_baize` and `glaiveai/glaive-function-calling-v2` datasets, encompassing a wide range of conversational contexts and system interactions.
## 📋 Usage
This dataset is intended for use in training and fine-tuning LLMs for function calling tasks within conversational AI systems. It can be leveraged to enhance the model's ability to parse and execute function calls based on user inputs, thereby improving the interactive capabilities of AI assistants and similar applications.
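As a minimal, hedged sketch of this use (the split name and column layout follow the description above; the flattening template with `<|role|>` markers is an illustrative assumption, not an official format), a record can be serialized into a single fine-tuning string like so:

```python
# To load the real data (requires network access):
# from datasets import load_dataset
# dataset = load_dataset("thibaud-perrin/hibo-function-calling-v1", split="train")

def to_training_text(record):
    """Flatten a 'system' + 'chat' record into a single fine-tuning string."""
    parts = []
    for msg in record["system"]:
        parts.append(f"<|system|>\n{msg['content']}")
    for msg in record["chat"]:
        parts.append(f"<|{msg['role']}|>\n{msg['content']}")
    return "\n".join(parts)

# Illustrative record matching the schema above.
sample = {
    "dataset_origin": "glaiveai/glaive-function-calling-v2",
    "system": [{"role": "system", "content": "You can call functions."}],
    "chat": [
        {"role": "user", "content": "What's the weather in Paris?"},
        {"role": "assistant", "content": 'get_weather({"city": "Paris"})'},
    ],
}
print(to_training_text(sample))
```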
## 📚 Citation
Please cite this dataset using the following BibTeX entry:
```bibtex
@misc{hibo-function-calling-v1,
  author = {Thibaud Perrin},
  title = {hibo-function-calling-v1: A Dataset for Function Calling in Conversational AI},
  year = {2024},
  publisher = {Hugging Face},
}
```
## 📖 Acknowledgements
This dataset was developed by merging data from [`gathnex/Gath_baize`](https://huggingface.co/datasets/gathnex/Gath_baize) and [`glaiveai/glaive-function-calling-v2`](https://huggingface.co/datasets/glaiveai/glaive-function-calling-v2). We extend our gratitude to the creators and contributors of these datasets for providing the foundational data necessary for creating `hibo-function-calling-v1`.
|
nietras/llm.bin | ---
license: mit
---
|
lixugang/fullsmall | ---
license: apache-2.0
---
|
teowu/LLDescribe-QBench | ---
license: cc-by-nc-sa-4.0
---
Dataset for Paper: **Q-Bench: A Benchmark for General-Purpose Foundation Models on Low-level Vision**.
See GitHub: https://github.com/vqassessment/q-bench.
Feel free to cite us.
```bibtex
@article{wu2023qbench,
title={Q-Bench: A Benchmark for General-Purpose Foundation Models on Low-level Vision},
author={Wu, Haoning and Zhang, Zicheng and Zhang, Erli and Chen, Chaofeng and Liao, Liang and Wang, Annan and Li, Chunyi and Sun, Wenxiu and Yan, Qiong and Zhai, Guangtao and Lin, Weisi},
year={2023},
eprint={2309.14181},
}
``` |
z-uo/squad-it | ---
language:
- it
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
task_categories:
- question-answering
task_ids:
- extractive-qa
---
# Squad-it
This dataset is an adapted version of [squad-it](https://github.com/crux82/squad-it) for training Hugging Face models.
It contains:
- train samples: 87599
- test samples : 10570
This dataset is for question answering and its format is the following:
```
[
{
"answers": [
{
"answer_start": [1],
"text": ["Questo è un testo"]
},
],
"context": "Questo è un testo relativo al contesto.",
"id": "1",
"question": "Questo è un testo?",
"title": "train test"
}
]
```
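A minimal sketch of consuming this format (the record mirrors the example above; field names follow the SQuAD layout, and the sample text is illustrative):

```python
import json

def load_qa_pairs(raw_json):
    """Yield (question, answer_text, answer_start) for each answer in a SQuAD-style list."""
    for record in json.loads(raw_json):
        for answer in record["answers"]:
            yield record["question"], answer["text"][0], answer["answer_start"][0]

# Illustrative record in the format shown above.
raw = json.dumps([{
    "answers": [{"answer_start": [0], "text": ["Questo è un testo"]}],
    "context": "Questo è un testo relativo al contesto.",
    "id": "1",
    "question": "Questo è un testo?",
    "title": "train test",
}])

for question, text, start in load_qa_pairs(raw):
    print(question, "->", text, f"(start={start})")
```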
It can be used to train many models such as T5, BERT, DistilBERT, ... |
iloraishaque/bronte-book-full | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2730899
num_examples: 1
download_size: 1737542
dataset_size: 2730899
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
one-sec-cv12/chunk_268 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 20271698784.75
num_examples: 211058
download_size: 18222530794
dataset_size: 20271698784.75
---
# Dataset Card for "chunk_268"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ppxscal/academic_embeddings_test | ---
dataset_info:
features:
- name: Query Text
dtype: string
- name: Ranking 1
dtype: string
- name: Ranking 2
dtype: string
- name: Ranking 3
dtype: string
- name: Ranking 4
dtype: string
- name: Ranking 5
dtype: string
- name: Ranking 6
dtype: string
- name: Ranking 7
dtype: string
- name: Ranking 8
dtype: string
- name: Ranking 9
dtype: string
- name: Ranking 10
dtype: string
- name: Ranking 11
dtype: string
- name: Ranking 12
dtype: string
- name: Ranking 13
dtype: string
- name: score_0
dtype: float64
- name: score_1
dtype: float64
- name: score_2
dtype: float64
- name: score_3
dtype: float64
- name: score_4
dtype: float64
- name: score_5
dtype: float64
- name: score_6
dtype: float64
- name: score_7
dtype: float64
- name: score_8
dtype: float64
- name: score_9
dtype: float64
- name: score_10
dtype: float64
- name: score_11
dtype: float64
- name: score_12
dtype: float64
- name: score_13
dtype: float64
splits:
- name: train
num_bytes: 2219576903
num_examples: 120639
download_size: 160258762
dataset_size: 2219576903
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Indic-Benchmark/tamil-arc-c-2.5k | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
struct:
- name: choices
list:
- name: label
dtype: string
- name: text
dtype: string
- name: stem
dtype: string
- name: answerKey
dtype: string
splits:
- name: train
num_bytes: 2224331
num_examples: 2547
download_size: 777541
dataset_size: 2224331
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
DynamicSuperbPrivate/SpeechTextMatching_LibrispeechTrainClean100 | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype: audio
- name: text
dtype: string
- name: instruction
dtype: string
- name: label
dtype: string
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 6378650249.671
num_examples: 28539
- name: validation
num_bytes: 348628035.844
num_examples: 2703
download_size: 6779588288
dataset_size: 6727278285.514999
---
# Dataset Card for "speechTextMatching_LibrispeechTrainClean100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
moyix/debian_csrc | ---
license: mit
---
|
one-sec-cv12/chunk_174 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 21068705088.5
num_examples: 219356
download_size: 19201696100
dataset_size: 21068705088.5
---
# Dataset Card for "chunk_174"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
centroIA/MistralInstructScenariosv2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 2687418
num_examples: 967
download_size: 698118
dataset_size: 2687418
---
# Dataset Card for "MistralInstructScenariosv2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench11 | ---
pretty_name: Evaluation run of Undi95/Mistral-11B-TestBench11
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Undi95/Mistral-11B-TestBench11](https://huggingface.co/Undi95/Mistral-11B-TestBench11)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench11\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-28T01:59:23.177639](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench11/blob/main/results_2023-10-28T01-59-23.177639.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.02904781879194631,\n\
\ \"em_stderr\": 0.0017198688690203193,\n \"f1\": 0.09573615771812093,\n\
\ \"f1_stderr\": 0.0021674728464020697,\n \"acc\": 0.463391282649971,\n\
\ \"acc_stderr\": 0.010754512266719978\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.02904781879194631,\n \"em_stderr\": 0.0017198688690203193,\n\
\ \"f1\": 0.09573615771812093,\n \"f1_stderr\": 0.0021674728464020697\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.14935557240333586,\n \
\ \"acc_stderr\": 0.00981809072372729\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7774269928966061,\n \"acc_stderr\": 0.011690933809712667\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Undi95/Mistral-11B-TestBench11
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|arc:challenge|25_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_28T01_59_23.177639
path:
- '**/details_harness|drop|3_2023-10-28T01-59-23.177639.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-28T01-59-23.177639.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_28T01_59_23.177639
path:
- '**/details_harness|gsm8k|5_2023-10-28T01-59-23.177639.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-28T01-59-23.177639.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hellaswag|10_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T20-08-34.702863.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T20-08-34.702863.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T20-08-34.702863.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_28T01_59_23.177639
path:
- '**/details_harness|winogrande|5_2023-10-28T01-59-23.177639.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-28T01-59-23.177639.parquet'
- config_name: results
data_files:
- split: 2023_10_11T20_08_34.702863
path:
- results_2023-10-11T20-08-34.702863.parquet
- split: 2023_10_28T01_59_23.177639
path:
- results_2023-10-28T01-59-23.177639.parquet
- split: latest
path:
- results_2023-10-28T01-59-23.177639.parquet
---
# Dataset Card for Evaluation run of Undi95/Mistral-11B-TestBench11
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/Mistral-11B-TestBench11
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/Mistral-11B-TestBench11](https://huggingface.co/Undi95/Mistral-11B-TestBench11) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench11",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-28T01:59:23.177639](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__Mistral-11B-TestBench11/blob/main/results_2023-10-28T01-59-23.177639.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.02904781879194631,
"em_stderr": 0.0017198688690203193,
"f1": 0.09573615771812093,
"f1_stderr": 0.0021674728464020697,
"acc": 0.463391282649971,
"acc_stderr": 0.010754512266719978
},
"harness|drop|3": {
"em": 0.02904781879194631,
"em_stderr": 0.0017198688690203193,
"f1": 0.09573615771812093,
"f1_stderr": 0.0021674728464020697
},
"harness|gsm8k|5": {
"acc": 0.14935557240333586,
"acc_stderr": 0.00981809072372729
},
"harness|winogrande|5": {
"acc": 0.7774269928966061,
"acc_stderr": 0.011690933809712667
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
EleutherAI/quirky_multiplication_increment0_bob_hard | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: alice_label
dtype: bool
- name: bob_label
dtype: bool
- name: difficulty
dtype: int64
- name: statement
dtype: string
- name: choices
sequence: string
- name: character
dtype: string
- name: label
dtype: bool
splits:
- name: train
num_bytes: 3174803.002375
num_examples: 48012
- name: validation
num_bytes: 64804.215
num_examples: 980
- name: test
num_bytes: 64855.3815
num_examples: 981
download_size: 1086712
dataset_size: 3304462.598875
---
# Dataset Card for "quirky_multiplication_increment0_bob_hard"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
crodri/ccma_meteo_instruct | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- ca
license:
- mit
multilinguality:
- monolingual
pretty_name: ccma_meteo_instruct
size_categories:
- unknown
source_datasets: []
task_categories: []
task_ids: []
---
# Dataset Card for CEIL
## Dataset Description
- **Website:** https://aina.bsc.es
- **Point of Contact:** [Carlos Rodríguez-Penagos](carlos.rodriguez1@bsc.es)
### Dataset Summary
NERC for understanding meteorological queries for an AI assistant
This dataset was developed by [BSC LangTech Unit](https://langtech.bsc.es/) as part of the [Projecte AINA](https://politiquesdigitals.gencat.cat/ca/economia/catalonia-ai/aina/), to enrich the [Catalan Language Understanding Benchmark (CLUB)](https://club.aina.bsc.es/).
### Supported Tasks and Leaderboards
Named Entities Recognition, Language Model
### Languages
The dataset is in Catalan (`ca-CA`).
## Dataset Structure
### Data Instances
Three two-column files, one for each split.
<pre>
Com O
serà O
a O
l O
mati interval
el O
temps O
a O
O location
Grove location
el O
dijous day
? O
</pre>
### Data Fields
Every file has two columns, with the word form or punctuation symbol in the first one and the corresponding IOB tag in the second one.
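As an illustrative sketch (not part of the official tooling; the function name is hypothetical), a two-column file in this format can be parsed into `(tokens, tags)` pairs, with blank lines taken as sentence boundaries:

```python
def read_conll(path):
    """Parse a two-column CONLL-style file into (tokens, tags) sentences.

    Blank lines separate sentences; each non-blank line holds a word form
    and its IOB tag, whitespace-separated.
    """
    sentences, tokens, tags = [], [], []
    with open(path, encoding="utf-8") as f:
        for line in f:
            if not line.strip():
                if tokens:  # sentence boundary
                    sentences.append((tokens, tags))
                    tokens, tags = [], []
                continue
            form, tag = line.split()[:2]  # word form, IOB tag
            tokens.append(form)
            tags.append(tag)
    if tokens:  # flush the last sentence if the file lacks a trailing blank line
        sentences.append((tokens, tags))
    return sentences
```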
### Data Splits
85/15 Train and development sets, balanced for all NERC tags.
## Dataset Creation
### Curation Rationale
We created this corpus to contribute to the development of language models in Catalan.
### Source Data
Synthetic data
#### Initial Data Collection and Normalization
The word tokenization used to convert offset annotations into CONLL files was done using spaCy.
#### Who are the source language producers?
### Annotations
#### Annotation process
We adapted the NER labels from to a token-per-line, multi-column format.
#### Who are the annotators?
Original annotators from
### Personal and Sensitive Information
No personal or sensitive information included.
## Considerations for Using the Data
### Social Impact of Dataset
We hope this corpus contributes to the development of language models in Catalan, a low-resource language.
### Discussion of Biases
[N/A]
### Other Known Limitations
[N/A]
## Additional Information
### Dataset Curators
This work was funded by the [Departament de la Vicepresidència i de Polítiques Digitals i Territori de la Generalitat de Catalunya](https://politiquesdigitals.gencat.cat/en/inici/index.html) within the framework of [Projecte AINA](https://politiquesdigitals.gencat.cat/ca/economia/catalonia-ai/aina/).
### Licensing information
This work is licensed under a <a rel="license" href="https://creativecommons.org/licenses/by/4.0/">Attribution 4.0 International License</a>.
### Citation Information
```
```
### Contributions
[N/A]
|
Bertievidgen/SimpleSafetyTests | ---
license: cc-by-2.0
description: >-
SimpleSafetyTests is a test suite for rapidly and systematically identifying
critical safety risks in language models. It can be used to evaluate the
safety of your LLM, as demonstrated in our paper
https://arxiv.org/abs/2311.08370.
suite:
- name: SimpleSafetyTests
prompts: 100
harm_areas:
- Suicide, Self-Harm and Eating Disorders
- Physical Harm
- Illegal and Highly Regulated Items
- Scams and Fraud
- Child Abuse
caution: >-
The prompts are sensitive and you could find them harmful. For the vast
majority of applications, LLMs should refuse to comply with all of them.
task_categories:
- text-generation
language:
- en
pretty_name: SimpleSafetyTests
size_categories:
- n<1K
--- |
nuvocare/WikiMedical_sentence_similarity | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text1
dtype: string
- name: text2
dtype: string
- name: label
dtype:
class_label:
names:
'0': '-1'
'1': '1'
splits:
- name: train
num_bytes: 150266647.47592032
num_examples: 50712
- name: test
num_bytes: 64403801.52407967
num_examples: 21735
download_size: 129675237
dataset_size: 214670449.0
---
# Dataset Card for "WikiMedical_sentence_similarity"
WikiMedical_sentence_similarity is an adapted and ready-to-use sentence similarity dataset based on [this dataset](https://huggingface.co/datasets/gamino/wiki_medical_terms).
The preprocessing followed three steps:
- Each text is split into sentences of 256 tokens (NLTK tokenizer)
- Each sentence is paired with a positive pair if one is found, and with a negative pair; negative pairs are drawn randomly from the whole dataset
- The train and test splits correspond to a 70%/30% partition
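The negative-pair step above can be sketched as follows. This is a simplified illustration of the sampling idea, not the exact preprocessing script; `add_negative_pairs` and the toy corpus are assumptions for the example:

```python
import random

def add_negative_pairs(sentences, seed=0):
    """For each sentence, draw a random *other* sentence as a negative
    pair (label -1). Positive pairs (label 1) would come from related
    text; here we only illustrate the random negative sampling."""
    rng = random.Random(seed)
    pairs = []
    for i, s in enumerate(sentences):
        j = rng.randrange(len(sentences))
        while j == i:                 # never pair a sentence with itself
            j = rng.randrange(len(sentences))
        pairs.append((s, sentences[j], -1))
    return pairs

corpus = ["Aspirin is an NSAID.",
          "Malaria is mosquito-borne.",
          "Insulin lowers blood glucose."]
negatives = add_negative_pairs(corpus)
```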
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/d6b9c357 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 38
num_examples: 2
download_size: 1272
dataset_size: 38
---
# Dataset Card for "d6b9c357"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CTS-Diagnostics/slack_support_bot | ---
license: mit
---
|
premky85/test-textual-inversion | ---
license: afl-3.0
---
|
SaiedAlshahrani/MASD | ---
license: mit
language:
- ar
pretty_name: MASD
size_categories:
- n<1K
---
# Dataset Card for "Masked Arab States Dataset (MASD)"
This dataset is created using 20 Arab states<sup>1</sup> with their corresponding capital cities, nationalities, currencies, and the continents on which they are located. It consists of four categories: country-capital
prompts, country-currency prompts, country-nationality prompts, and country-continent prompts. Each prompt category has 40 masked prompts, and the total number of masked prompts in the MASD dataset is 160. This dataset is used to evaluate the following Arabic Masked Language Models (MLMs):
1. [SaiedAlshahrani/arwiki_20230101_roberta_mlm_bots](https://huggingface.co/SaiedAlshahrani/arwiki_20230101_roberta_mlm_bots).
2. [SaiedAlshahrani/arwiki_20230101_roberta_mlm_nobots](https://huggingface.co/SaiedAlshahrani/arwiki_20230101_roberta_mlm_nobots).
3. [SaiedAlshahrani/arzwiki_20230101_roberta_mlm](https://huggingface.co/SaiedAlshahrani/arzwiki_20230101_roberta_mlm).
4. [SaiedAlshahrani/arywiki_20230101_roberta_mlm_bots](https://huggingface.co/SaiedAlshahrani/arywiki_20230101_roberta_mlm_bots).
5. [SaiedAlshahrani/arywiki_20230101_roberta_mlm_nobots](https://huggingface.co/SaiedAlshahrani/arywiki_20230101_roberta_mlm_nobots).
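Evaluating an MLM on prompts like these amounts to checking whether the model's top fill-mask prediction matches the expected word. A minimal sketch of that accuracy computation follows; the prompt/answer pairs are illustrative English stand-ins, not taken from MASD, and `predict_top_token` stands in for a real fill-mask model:

```python
def fill_mask_accuracy(prompts, predict_top_token):
    """prompts: list of (masked_prompt, expected_token) pairs.
    predict_top_token: callable mapping a masked prompt to the model's
    top-1 prediction for the <mask> slot."""
    correct = sum(predict_top_token(p) == expected for p, expected in prompts)
    return correct / len(prompts)

# Hypothetical English stand-ins for the Arabic country-capital prompts.
demo = [("The capital of Egypt is <mask>.", "Cairo"),
        ("The capital of Morocco is <mask>.", "Rabat")]
toy_model = {p: a for p, a in demo}.get   # a "model" that always answers correctly
print(fill_mask_accuracy(demo, toy_model))  # 1.0
```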
For more details about the dataset, please **read** and **cite** our paper:
```bash
@inproceedings{alshahrani-etal-2023-performance,
title = "{Performance Implications of Using Unrepresentative Corpora in {A}rabic Natural Language Processing}",
author = "Alshahrani, Saied and Alshahrani, Norah and Dey, Soumyabrata and Matthews, Jeanna",
booktitle = "Proceedings of the The First Arabic Natural Language Processing Conference (ArabicNLP 2023)",
    month = dec,
year = "2023",
address = "Singapore (Hybrid)",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.arabicnlp-1.19",
doi = "10.18653/v1/2023.arabicnlp-1.19",
pages = "218--231",
abstract = "Wikipedia articles are a widely used source of training data for Natural Language Processing (NLP) research, particularly as corpora for low-resource languages like Arabic. However, it is essential to understand the extent to which these corpora reflect the representative contributions of native speakers, especially when many entries in a given language are directly translated from other languages or automatically generated through automated mechanisms. In this paper, we study the performance implications of using inorganic corpora that are not representative of native speakers and are generated through automated techniques such as bot generation or automated template-based translation. The case of the Arabic Wikipedia editions gives a unique case study of this since the Moroccan Arabic Wikipedia edition (ARY) is small but representative, the Egyptian Arabic Wikipedia edition (ARZ) is large but unrepresentative, and the Modern Standard Arabic Wikipedia edition (AR) is both large and more representative. We intrinsically evaluate the performance of two main NLP upstream tasks, namely word representation and language modeling, using word analogy evaluations and fill-mask evaluations using our two newly created datasets: Arab States Analogy Dataset (ASAD) and Masked Arab States Dataset (MASD). We demonstrate that for good NLP performance, we need both large and organic corpora; neither alone is sufficient. We show that producing large corpora through automated means can be a counter-productive, producing models that both perform worse and lack cultural richness and meaningful representation of the Arabic language and its native speakers.",
}
```
<sub>1. We only drop two Arab states: the United Arab Emirates (الإمارات العربية المتحدة) and Comoros (جزر القمر), because they or their capital cities are written as open compound words (two words), which cannot be directly handled by the word embedding models, like Abu Dhabi (أبو ظبي).</sub> |
Verne/dreambooth-hackathon-images | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 828898.0
num_examples: 20
download_size: 827203
dataset_size: 828898.0
---
# Dataset Card for "dreambooth-hackathon-images"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yn01/test_20240109_01 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1740
num_examples: 11
download_size: 2391
dataset_size: 1740
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-eval-mathemakitten__winobias_antistereotype_test-mathemakitt-e08cac-1731660420 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- mathemakitten/winobias_antistereotype_test
eval_info:
task: text_zero_shot_classification
model: gpt2
metrics: []
dataset_name: mathemakitten/winobias_antistereotype_test
dataset_config: mathemakitten--winobias_antistereotype_test
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: gpt2
* Dataset: mathemakitten/winobias_antistereotype_test
* Config: mathemakitten--winobias_antistereotype_test
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@tomekkorbak](https://huggingface.co/tomekkorbak) for evaluating this model. |
open-llm-leaderboard/details_LoSboccacc__orthogonal-2x7B-base | ---
pretty_name: Evaluation run of LoSboccacc/orthogonal-2x7B-base
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [LoSboccacc/orthogonal-2x7B-base](https://huggingface.co/LoSboccacc/orthogonal-2x7B-base)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LoSboccacc__orthogonal-2x7B-base\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-16T21:21:27.618218](https://huggingface.co/datasets/open-llm-leaderboard/details_LoSboccacc__orthogonal-2x7B-base/blob/main/results_2024-01-16T21-21-27.618218.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6260063657620419,\n\
\ \"acc_stderr\": 0.03286232914053458,\n \"acc_norm\": 0.6295442881937665,\n\
\ \"acc_norm_stderr\": 0.033515089160485206,\n \"mc1\": 0.49326805385556916,\n\
\ \"mc1_stderr\": 0.017501914492655386,\n \"mc2\": 0.6600496157622183,\n\
\ \"mc2_stderr\": 0.015282722255268989\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6194539249146758,\n \"acc_stderr\": 0.014188277712349812,\n\
\ \"acc_norm\": 0.6689419795221843,\n \"acc_norm_stderr\": 0.013752062419817836\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6698864767974507,\n\
\ \"acc_stderr\": 0.004692926794268465,\n \"acc_norm\": 0.8554072893845848,\n\
\ \"acc_norm_stderr\": 0.0035097096477918416\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.038424985593952694,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.038424985593952694\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\
\ \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n\
\ \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\"\
: 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n\
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.0368122963339432,\n\
\ \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.0368122963339432\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n\
\ \"acc_stderr\": 0.04810840148082635,\n \"acc_norm\": 0.37254901960784315,\n\
\ \"acc_norm_stderr\": 0.04810840148082635\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5574468085106383,\n\
\ \"acc_stderr\": 0.03246956919789958,\n \"acc_norm\": 0.5574468085106383,\n\
\ \"acc_norm_stderr\": 0.03246956919789958\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.046446020912223177,\n\
\ \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.046446020912223177\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.6,\n \"acc_stderr\": 0.040824829046386284,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.040824829046386284\n },\n \"harness|hendrycksTest-elementary_mathematics|5\"\
: {\n \"acc\": 0.4021164021164021,\n \"acc_stderr\": 0.02525303255499769,\n\
\ \"acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.02525303255499769\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6645161290322581,\n \"acc_stderr\": 0.02686020644472435,\n \"\
acc_norm\": 0.6645161290322581,\n \"acc_norm_stderr\": 0.02686020644472435\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n \"\
acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758723,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758723\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5974358974358974,\n \"acc_stderr\": 0.02486499515976775,\n \
\ \"acc_norm\": 0.5974358974358974,\n \"acc_norm_stderr\": 0.02486499515976775\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8220183486238533,\n \"acc_stderr\": 0.016399436366612893,\n \"\
acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.016399436366612893\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854053,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854053\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251735,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251735\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.026558372502661916,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.026558372502661916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596914,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596914\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8181818181818182,\n \"acc_stderr\": 0.035208939510976534,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.035208939510976534\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.034878251684978906,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.034878251684978906\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.021901905115073318,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.021901905115073318\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8071519795657727,\n\
\ \"acc_stderr\": 0.014108533515757431,\n \"acc_norm\": 0.8071519795657727,\n\
\ \"acc_norm_stderr\": 0.014108533515757431\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n\
\ \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4312849162011173,\n\
\ \"acc_stderr\": 0.016563829399047707,\n \"acc_norm\": 0.4312849162011173,\n\
\ \"acc_norm_stderr\": 0.016563829399047707\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279056,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279056\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188943,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188943\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.02517104191530968,\n\
\ \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.02517104191530968\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4511082138200782,\n\
\ \"acc_stderr\": 0.012709037347346233,\n \"acc_norm\": 0.4511082138200782,\n\
\ \"acc_norm_stderr\": 0.012709037347346233\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6360294117647058,\n \"acc_stderr\": 0.02922719246003203,\n\
\ \"acc_norm\": 0.6360294117647058,\n \"acc_norm_stderr\": 0.02922719246003203\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.03333333333333334,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.03333333333333334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.0266405825391332,\n\
\ \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.0266405825391332\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.49326805385556916,\n\
\ \"mc1_stderr\": 0.017501914492655386,\n \"mc2\": 0.6600496157622183,\n\
\ \"mc2_stderr\": 0.015282722255268989\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7703235990528808,\n \"acc_stderr\": 0.011821645601838232\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5079605761940864,\n \
\ \"acc_stderr\": 0.013770739063135374\n }\n}\n```"
repo_url: https://huggingface.co/LoSboccacc/orthogonal-2x7B-base
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|arc:challenge|25_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|gsm8k|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hellaswag|10_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T21-21-27.618218.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T21-21-27.618218.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- '**/details_harness|winogrande|5_2024-01-16T21-21-27.618218.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-16T21-21-27.618218.parquet'
- config_name: results
data_files:
- split: 2024_01_16T21_21_27.618218
path:
- results_2024-01-16T21-21-27.618218.parquet
- split: latest
path:
- results_2024-01-16T21-21-27.618218.parquet
---
# Dataset Card for Evaluation run of LoSboccacc/orthogonal-2x7B-base
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [LoSboccacc/orthogonal-2x7B-base](https://huggingface.co/LoSboccacc/orthogonal-2x7B-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LoSboccacc__orthogonal-2x7B-base",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-16T21:21:27.618218](https://huggingface.co/datasets/open-llm-leaderboard/details_LoSboccacc__orthogonal-2x7B-base/blob/main/results_2024-01-16T21-21-27.618218.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6260063657620419,
"acc_stderr": 0.03286232914053458,
"acc_norm": 0.6295442881937665,
"acc_norm_stderr": 0.033515089160485206,
"mc1": 0.49326805385556916,
"mc1_stderr": 0.017501914492655386,
"mc2": 0.6600496157622183,
"mc2_stderr": 0.015282722255268989
},
"harness|arc:challenge|25": {
"acc": 0.6194539249146758,
"acc_stderr": 0.014188277712349812,
"acc_norm": 0.6689419795221843,
"acc_norm_stderr": 0.013752062419817836
},
"harness|hellaswag|10": {
"acc": 0.6698864767974507,
"acc_stderr": 0.004692926794268465,
"acc_norm": 0.8554072893845848,
"acc_norm_stderr": 0.0035097096477918416
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.038424985593952694,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.038424985593952694
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.038009680605548594,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.038009680605548594
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6,
"acc_stderr": 0.040824829046386284,
"acc_norm": 0.6,
"acc_norm_stderr": 0.040824829046386284
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.02525303255499769,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.02525303255499769
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6645161290322581,
"acc_stderr": 0.02686020644472435,
"acc_norm": 0.6645161290322581,
"acc_norm_stderr": 0.02686020644472435
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758723,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758723
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5974358974358974,
"acc_stderr": 0.02486499515976775,
"acc_norm": 0.5974358974358974,
"acc_norm_stderr": 0.02486499515976775
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.016399436366612893,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.016399436366612893
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.03407632093854053,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.03407632093854053
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.029331162294251735,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.029331162294251735
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.026558372502661916,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.026558372502661916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596914,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596914
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.035208939510976534,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.035208939510976534
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073318,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073318
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8071519795657727,
"acc_stderr": 0.014108533515757431,
"acc_norm": 0.8071519795657727,
"acc_norm_stderr": 0.014108533515757431
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4312849162011173,
"acc_stderr": 0.016563829399047707,
"acc_norm": 0.4312849162011173,
"acc_norm_stderr": 0.016563829399047707
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279056,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279056
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188943,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188943
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.02517104191530968,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.02517104191530968
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4511082138200782,
"acc_stderr": 0.012709037347346233,
"acc_norm": 0.4511082138200782,
"acc_norm_stderr": 0.012709037347346233
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6360294117647058,
"acc_stderr": 0.02922719246003203,
"acc_norm": 0.6360294117647058,
"acc_norm_stderr": 0.02922719246003203
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724553,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03333333333333334,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03333333333333334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.0266405825391332,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.0266405825391332
},
"harness|truthfulqa:mc|0": {
"mc1": 0.49326805385556916,
"mc1_stderr": 0.017501914492655386,
"mc2": 0.6600496157622183,
"mc2_stderr": 0.015282722255268989
},
"harness|winogrande|5": {
"acc": 0.7703235990528808,
"acc_stderr": 0.011821645601838232
},
"harness|gsm8k|5": {
"acc": 0.5079605761940864,
"acc_stderr": 0.013770739063135374
}
}
```
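The `acc_stderr` values in the results above are consistent with the sample standard error of a binomial proportion, `sqrt(acc * (1 - acc) / (n - 1))`. A minimal sketch checking this against two of the reported subtasks; the subtask sizes (n = 100 questions for `medical_genetics` and `us_foreign_policy`) and the n − 1 convention are assumptions about how the harness reports accuracy, not something stated in this card:

```python
import math

def acc_stderr(acc: float, n: int) -> float:
    """Sample standard error of a proportion, with n - 1 in the denominator."""
    return math.sqrt(acc * (1 - acc) / (n - 1))

# medical_genetics: acc = 0.7 over an assumed 100 questions
print(acc_stderr(0.7, 100))   # matches the reported 0.046056618647183814

# us_foreign_policy: acc = 0.82 over an assumed 100 questions
print(acc_stderr(0.82, 100))  # matches the reported 0.038612291966536934
```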
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
irds/istella22_test_fold2 | ---
pretty_name: '`istella22/test/fold2`'
viewer: false
source_datasets: ['irds/istella22']
task_categories:
- text-retrieval
---
# Dataset Card for `istella22/test/fold2`
The `istella22/test/fold2` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/istella22#istella22/test/fold2).
# Data
This dataset provides:
- `queries` (i.e., topics); count=440
- `qrels` (relevance assessments); count=2,140
- For `docs`, use [`irds/istella22`](https://huggingface.co/datasets/irds/istella22)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/istella22_test_fold2', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/istella22_test_fold2', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
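Once loaded, the qrels are often easiest to use as a per-query lookup table. A small sketch under the assumption that each record is a dict with the `query_id`/`doc_id`/`relevance` fields shown above (the helper name is ours, not part of the package):

```python
from collections import defaultdict

def qrels_to_lookup(records):
    """Map query_id -> {doc_id: relevance} for fast judgment lookups."""
    lookup = defaultdict(dict)
    for rec in records:
        lookup[rec['query_id']][rec['doc_id']] = rec['relevance']
    return dict(lookup)

# e.g. lookup = qrels_to_lookup(qrels); lookup[some_query_id][some_doc_id]
# then gives the graded relevance for that query/document pair.
```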
|
open-llm-leaderboard/details_R136a1__InfinityLake-2x7B | ---
pretty_name: Evaluation run of R136a1/InfinityLake-2x7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [R136a1/InfinityLake-2x7B](https://huggingface.co/R136a1/InfinityLake-2x7B) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_R136a1__InfinityLake-2x7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-15T21:42:42.681302](https://huggingface.co/datasets/open-llm-leaderboard/details_R136a1__InfinityLake-2x7B/blob/main/results_2024-04-15T21-42-42.681302.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6486382462893618,\n\
\ \"acc_stderr\": 0.03218860377821126,\n \"acc_norm\": 0.648769127038304,\n\
\ \"acc_norm_stderr\": 0.03285634124861645,\n \"mc1\": 0.46266829865361075,\n\
\ \"mc1_stderr\": 0.01745464515097059,\n \"mc2\": 0.6137503471716889,\n\
\ \"mc2_stderr\": 0.015511399292467727\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6706484641638225,\n \"acc_stderr\": 0.013734057652635474,\n\
\ \"acc_norm\": 0.7056313993174061,\n \"acc_norm_stderr\": 0.01331852846053942\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7005576578370842,\n\
\ \"acc_stderr\": 0.004570777326263903,\n \"acc_norm\": 0.8740290778729337,\n\
\ \"acc_norm_stderr\": 0.003311384498158646\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4021164021164021,\n \"acc_stderr\": 0.02525303255499769,\n \"\
acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.02525303255499769\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7935483870967742,\n \"acc_stderr\": 0.023025899617188716,\n \"\
acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.023025899617188716\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"\
acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386417,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386417\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \
\ \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886786,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886786\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8366972477064221,\n \"acc_stderr\": 0.015848255806501555,\n \"\
acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.015848255806501555\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n\
\ \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.026558372502661916,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.026558372502661916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709696,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709696\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n\
\ \"acc_stderr\": 0.013586619219903341,\n \"acc_norm\": 0.8250319284802043,\n\
\ \"acc_norm_stderr\": 0.013586619219903341\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.023786203255508287,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.023786203255508287\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40558659217877097,\n\
\ \"acc_stderr\": 0.016421670506339175,\n \"acc_norm\": 0.40558659217877097,\n\
\ \"acc_norm_stderr\": 0.016421670506339175\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959607,\n\
\ \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959607\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46099290780141844,\n \"acc_stderr\": 0.029736592526424438,\n \
\ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.029736592526424438\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4634941329856584,\n\
\ \"acc_stderr\": 0.012736153390214961,\n \"acc_norm\": 0.4634941329856584,\n\
\ \"acc_norm_stderr\": 0.012736153390214961\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000325,\n \
\ \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000325\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.024845753212306053,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.024845753212306053\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.46266829865361075,\n\
\ \"mc1_stderr\": 0.01745464515097059,\n \"mc2\": 0.6137503471716889,\n\
\ \"mc2_stderr\": 0.015511399292467727\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8437253354380426,\n \"acc_stderr\": 0.010205351791873523\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6542835481425322,\n \
\ \"acc_stderr\": 0.01310042299044157\n }\n}\n```"
repo_url: https://huggingface.co/R136a1/InfinityLake-2x7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|arc:challenge|25_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|gsm8k|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hellaswag|10_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T21-42-42.681302.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T21-42-42.681302.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- '**/details_harness|winogrande|5_2024-04-15T21-42-42.681302.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-15T21-42-42.681302.parquet'
- config_name: results
data_files:
- split: 2024_04_15T21_42_42.681302
path:
- results_2024-04-15T21-42-42.681302.parquet
- split: latest
path:
- results_2024-04-15T21-42-42.681302.parquet
---
# Dataset Card for Evaluation run of R136a1/InfinityLake-2x7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [R136a1/InfinityLake-2x7B](https://huggingface.co/R136a1/InfinityLake-2x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_R136a1__InfinityLake-2x7B",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-04-15T21:42:42.681302](https://huggingface.co/datasets/open-llm-leaderboard/details_R136a1__InfinityLake-2x7B/blob/main/results_2024-04-15T21-42-42.681302.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6486382462893618,
"acc_stderr": 0.03218860377821126,
"acc_norm": 0.648769127038304,
"acc_norm_stderr": 0.03285634124861645,
"mc1": 0.46266829865361075,
"mc1_stderr": 0.01745464515097059,
"mc2": 0.6137503471716889,
"mc2_stderr": 0.015511399292467727
},
"harness|arc:challenge|25": {
"acc": 0.6706484641638225,
"acc_stderr": 0.013734057652635474,
"acc_norm": 0.7056313993174061,
"acc_norm_stderr": 0.01331852846053942
},
"harness|hellaswag|10": {
"acc": 0.7005576578370842,
"acc_stderr": 0.004570777326263903,
"acc_norm": 0.8740290778729337,
"acc_norm_stderr": 0.003311384498158646
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.02525303255499769,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.02525303255499769
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.023025899617188716,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.023025899617188716
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386417,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386417
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886786,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886786
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.015848255806501555,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.015848255806501555
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.026558372502661916,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.026558372502661916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709696,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709696
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903341,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903341
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.023786203255508287,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.023786203255508287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40558659217877097,
"acc_stderr": 0.016421670506339175,
"acc_norm": 0.40558659217877097,
"acc_norm_stderr": 0.016421670506339175
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.023891879541959607,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.023891879541959607
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.029736592526424438,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.029736592526424438
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4634941329856584,
"acc_stderr": 0.012736153390214961,
"acc_norm": 0.4634941329856584,
"acc_norm_stderr": 0.012736153390214961
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.019094228167000325,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.019094228167000325
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306053,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306053
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061456,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061456
},
"harness|truthfulqa:mc|0": {
"mc1": 0.46266829865361075,
"mc1_stderr": 0.01745464515097059,
"mc2": 0.6137503471716889,
"mc2_stderr": 0.015511399292467727
},
"harness|winogrande|5": {
"acc": 0.8437253354380426,
"acc_stderr": 0.010205351791873523
},
"harness|gsm8k|5": {
"acc": 0.6542835481425322,
"acc_stderr": 0.01310042299044157
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Algoroxyolo/kanji-dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 13284948.663
num_examples: 6409
download_size: 15598460
dataset_size: 13284948.663
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Smuggling1710/VTnsfw | ---
license: apache-2.0
tags:
- not-for-all-audiences
--- |
JinglesDados/FernandaCrispim | ---
license: openrail
---
|
moyusufff/mini-platypus-two | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4186564
num_examples: 1000
download_size: 2245921
dataset_size: 4186564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ArhamNaeem/fyp-code-gen-dataset | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 24421
num_examples: 101
download_size: 11195
dataset_size: 24421
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Jing24/seperate_0 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int32
- name: text
sequence: string
splits:
- name: train
num_bytes: 8063353
num_examples: 9208
download_size: 1455012
dataset_size: 8063353
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "seperate_0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kristinashemet/German_datasets | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 259881583
num_examples: 346965
download_size: 137269817
dataset_size: 259881583
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "German_datasets"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lmg-anon/VNTL-2k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 87890178
num_examples: 16887
download_size: 0
dataset_size: 87890178
---
# Dataset Card for "VNTL-2k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
StephanAkkerman/crypto-stock-tweets | ---
license: cc-by-4.0
language:
- en
tags:
- finance
- twitter
- tweets
- crypto
- stocks
pretty_name: Crypto & Stock Tweets
size_categories:
- 1M<n<10M
---
# Crypto & Stock Tweets
## Overview
This dataset is a combination of publicly available financial tweets.
## Dataset Size
- Stock Tweets: 2,624,314
- Crypto Tweets: 5,748,725
- Bitcoin Tweets: 4,820,915
## Sources
This dataset is a combination of data from various reputable sources, each contributing a unique perspective on financial tweets:
- [Stock Market Tweets Data](https://ieee-dataport.org/open-access/stock-market-tweets-data): 923,673 rows of stock tweets
- [Stock Market Tweets](https://huggingface.co/datasets/mjw/stock_market_tweets): 1,700,641 rows of stock tweets
- [Crypto Tweets](https://www.kaggle.com/datasets/leoth9/crypto-tweets): 10,438 rows of cryptocurrency tweets
- [Influencers' Tweets In Cryptocurrency](https://data.mendeley.com/datasets/8fbdhh72gs/5): 16,512 rows of cryptocurrency tweets
- [Bitcoin Tweets](https://data.mendeley.com/datasets/x7yvshrnxy/1): 76,797 rows of bitcoin tweets
- [Bitcoin Tweets](https://www.kaggle.com/datasets/kaushiksuresh147/bitcoin-tweets): 4,863,751 rows of bitcoin tweets
- [Crypto Tweets](https://www.kaggle.com/datasets/tleonel/crypto-tweets-80k-in-eng-aug-2022): 80,000 rows of cryptocurrency tweets
- [Cryptocurreny Sentiment Tweets](https://www.kaggle.com/datasets/rezasemyari/crypto-sentiment-tweets): 824,908 rows of cryptocurrency tweets
- [Financial Tweets](https://huggingface.co/datasets/StephanAkkerman/financial-tweets): 263,119 rows of financial tweets
- [Cryptocurrency Tweets](https://github.com/am15h/CrypTop12): 576,836 rows of cryptocurrency tweets
## Usage
This dataset can be used for pre-training language models on financial tweets.
## Pre-processing Steps
Originally, the combined datasets consisted of 9,336,675 rows. However, they contained duplicates and unhelpful tweets. The dataset has been cleaned of `t.co` URLs, duplicate text, empty text, and tweets that end with '...'.
As a result, the cleaned dataset consists of 8,024,269 rows, which is the version available here.
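The cleaning steps above can be sketched in a few lines of Python. This is an illustrative reconstruction, not the author's actual script — the exact regexes and filters used to produce the dataset may differ:

```python
import re

def clean_tweet(text):
    """Apply the cleaning rules described above: strip t.co URLs,
    then drop empty tweets and tweets truncated with a trailing '...'."""
    if text is None:
        return None
    # remove Twitter's t.co short URLs
    text = re.sub(r"https?://t\.co/\S+", "", text).strip()
    # discard empty tweets and tweets ending with '...'
    if not text or text.endswith("..."):
        return None
    return text

def clean_corpus(tweets):
    """Clean every tweet and drop exact duplicates, preserving order."""
    seen = set()
    cleaned = []
    for raw in tweets:
        text = clean_tweet(raw)
        if text is not None and text not in seen:
            seen.add(text)
            cleaned.append(text)
    return cleaned
```

Applied to the combined 9.3M-row corpus, filters of this kind account for the reduction to the 8,024,269 rows published here.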
## Acknowledgements
We extend our heartfelt gratitude to all the authors and contributors of the original datasets. Their efforts in data collection and curation have been pivotal in creating this comprehensive resource.
## License
This dataset is made available under the CC-BY-4.0 license, adhering to the licensing terms of the original datasets. |
EricWiener/llm4html-descgen | ---
task_categories:
- text-classification
language:
- en
tags:
- code
---
The dataset comes from [the original paper upload](https://console.cloud.google.com/storage/browser/gresearch/webllm/datasets/descgen) which was uploaded in a RecordIO format.
See the original paper [Understanding HTML with Large Language Models](https://arxiv.org/abs/2210.03945) for more details. |
twodgirl/baize-quora | ---
license: gpl-3.0
language:
- en
tags:
- quora
---
[Baize](https://github.com/project-baize/baize-chatbot) scraped questions from Quora; the dialogs were generated by letting ChatGPT chat with itself.
This dataset is in alpaca format. |
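For reference, a record in Alpaca format is a JSON object with `instruction`, `input`, and `output` fields. A minimal illustrative example (the field values here are invented, not taken from the dataset):

```python
import json

# One record in the Alpaca instruction format (illustrative values only).
record = {
    "instruction": "Answer the following question from Quora.",
    "input": "What is the capital of France?",
    "output": "The capital of France is Paris.",
}
print(json.dumps(record, indent=2))
```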
LeonardoTiger/wattson | ---
license: openrail
---
|
CyberHarem/kaede_lapisrelights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Kaede (Lapis Re:LiGHTs)
This is the dataset of Kaede (Lapis Re:LiGHTs), containing 66 images and their tags.
The core tags of this character are `hair_ornament, green_eyes, black_hair, bangs, side_ponytail, leaf_hair_ornament, purple_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g., Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 66 | 39.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kaede_lapisrelights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 66 | 33.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kaede_lapisrelights/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 137 | 62.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kaede_lapisrelights/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 66 | 39.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kaede_lapisrelights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 137 | 73.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kaede_lapisrelights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need this, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kaede_lapisrelights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, solo, blush, school_uniform, maple_leaf, short_sleeves, sidelocks |
| 1 | 8 |  |  |  |  |  | 1girl, solo, upper_body, school_uniform, closed_mouth, collarbone, frills, maple_leaf, outdoors, sailor_collar, looking_at_viewer, puffy_short_sleeves, smile, white_shirt |
| 2 | 6 |  |  |  |  |  | 1girl, long_hair, solo, wide_sleeves, black_skirt, long_sleeves, standing, green_kimono, smile, bare_shoulders, black_footwear, detached_sleeves, flower, folding_fan, frilled_sleeves, full_body, holding_fan, obi, open_mouth, outdoors |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | blush | school_uniform | maple_leaf | short_sleeves | sidelocks | upper_body | closed_mouth | collarbone | frills | outdoors | sailor_collar | looking_at_viewer | puffy_short_sleeves | smile | white_shirt | long_hair | wide_sleeves | black_skirt | long_sleeves | standing | green_kimono | bare_shoulders | black_footwear | detached_sleeves | flower | folding_fan | frilled_sleeves | full_body | holding_fan | obi | open_mouth |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:-----------------|:-------------|:----------------|:------------|:-------------|:---------------|:-------------|:---------|:-----------|:----------------|:--------------------|:----------------------|:--------|:--------------|:------------|:---------------|:--------------|:---------------|:-----------|:---------------|:-----------------|:-----------------|:-------------------|:---------|:--------------|:------------------|:------------|:--------------|:------|:-------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | | X | X | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | | | | | | | | | | X | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_13 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1187036856.0
num_examples: 233118
download_size: 1203444722
dataset_size: 1187036856.0
---
# Dataset Card for "chunk_13"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DopeorNope/new_instruct1 | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
- name: tag
dtype: string
splits:
- name: train
num_bytes: 401752312
num_examples: 98293
download_size: 198509322
dataset_size: 401752312
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
EgilKarlsen/AA_ApacheDistilRoBERTa_Finetuned | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 80318780.21618997
num_examples: 26057
- name: test
num_bytes: 26774087.073587257
num_examples: 8686
download_size: 147168121
dataset_size: 107092867.28977722
---
# Dataset Card for "AA_ApacheDistilRoBERTa_Finetuned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
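As a hedged illustration (not part of the original card): the schema above declares integer-named float32 columns together with a string `label`, which suggests each row stores one embedding dimension per column. Assuming the columns run `'0'` through `'767'` (only `'474'`–`'767'` are shown in this excerpt), a row could be reassembled into a single 768-dimensional vector like this:

```python
# Hypothetical sketch: reassemble the per-dimension float columns of one row
# into a single embedding vector. The total dimension count (768) and the
# column naming are assumptions inferred from the schema above.
row = {str(i): 0.0 for i in range(768)}  # stand-in row; real rows come from the dataset
row["label"] = "example-author"          # the string 'label' column from the schema

# Collect the float columns in index order to recover the embedding.
embedding = [row[str(i)] for i in range(768)]
assert len(embedding) == 768
```

768 matches the hidden size of DistilRoBERTa-style encoders, which the dataset name hints at, but that correspondence is an inference, not something the card states.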
vidhikatkoria/SGD_RideSharing | ---
dataset_info:
features:
- name: domain
dtype: string
- name: context
dtype: string
- name: response
dtype: string
- name: act
dtype: int64
- name: speaker
dtype: int64
splits:
- name: train
num_bytes: 658561.1466613673
num_examples: 2515
- name: test
num_bytes: 188
num_examples: 1
download_size: 242358
dataset_size: 658749.1466613673
---
# Dataset Card for "SGD_RideSharing"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hadikhamoud/test | ---
license: openrail
---
|
firqaaa/emotion-bahasa | ---
license: apache-2.0
---
|