| datasetId | card |
|---|---|
jmichaelov/inverse_scaling_prize-hindsight_neglect | ---
license: cc-by-4.0
task_categories:
- multiple-choice
language:
- en
pretty_name: Hindsight Neglect
---
The hindsight-neglect task from the [Inverse Scaling Prize](https://github.com/inverse-scaling/prize) |
kaleemWaheed/twitter_dataset_1713172285 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 28392
num_examples: 66
download_size: 15255
dataset_size: 28392
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
dustinwloring1988/openhermes | ---
dataset_info:
features:
- name: output
dtype: string
- name: input
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 307373346
num_examples: 242831
download_size: 143664644
dataset_size: 307373346
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mvasiliniuc/iva-swift-codeint-clean-train | ---
annotations_creators:
- crowdsourced
license: other
language_creators:
- crowdsourced
language:
- code
task_categories:
- text-generation
tags:
- code
- swift
- native iOS development
- curated
- training
size_categories:
- 10K<n<100K
source_datasets: []
pretty_name: iva-swift-codeint-clean
task_ids:
- language-modeling
---
# IVA Swift GitHub Code Dataset
## Dataset Description
This is the curated train split of the IVA Swift dataset extracted from GitHub.
It contains curated Swift files gathered to train a code-generation model.
The dataset consists of 320,000 Swift code files from GitHub.
[Here is the unsliced curated dataset](https://huggingface.co/datasets/mvasiliniuc/iva-swift-codeint-clean) and
[here is the raw dataset](https://huggingface.co/datasets/mvasiliniuc/iva-swift-codeint).
### How to use it
To download the full dataset:
```python
from datasets import load_dataset
dataset = load_dataset('mvasiliniuc/iva-swift-codeint-clean', split='train')
```
## Data Structure
### Data Fields
|Field|Type|Description|
|---|---|---|
|repo_name|string|name of the GitHub repository|
|path|string|path of the file in the GitHub repository|
|copies|string|number of occurrences in the dataset|
|content|string|content of the source file|
|size|string|size of the source file in bytes|
|license|string|license of the GitHub repository|
|hash|string|hash of the content field|
|line_mean|number|mean line length of the content|
|line_max|number|max line length of the content|
|alpha_frac|number|fraction of alphabetic characters in the content|
|ratio|number|character-to-token ratio of the file when tokenized|
|autogenerated|boolean|true if the content is autogenerated, detected via keywords in the first few lines of the file|
|config_or_test|boolean|true if the content is a configuration file or a unit test|
|has_no_keywords|boolean|true if the file contains none of the Swift Programming Language keywords|
|has_few_assignments|boolean|true if the file uses the symbol '=' fewer than `minimum` times|
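The four boolean curation flags above can be combined into a simple keep/drop predicate before training, e.g. via `dataset.filter`. A minimal sketch in plain Python, operating on records shaped like the instance shown in the next section (the function name and the numeric thresholds are illustrative assumptions, not part of the dataset):

```python
def keep_for_training(record, max_line_mean=100.0, min_alpha_frac=0.25):
    """Drop files flagged during curation; thresholds are illustrative."""
    if record["autogenerated"] or record["config_or_test"]:
        return False
    if record["has_no_keywords"] or record["has_few_assignments"]:
        return False
    # Additionally guard against pathological files using the numeric stats.
    return (record["line_mean"] <= max_line_mean
            and record["alpha_frac"] >= min_alpha_frac)

sample = {
    "autogenerated": False, "config_or_test": False,
    "has_no_keywords": False, "has_few_assignments": False,
    "line_mean": 29.11, "alpha_frac": 0.67,
}
print(keep_for_training(sample))  # True: a well-formed source file passes
```

With the `datasets` library, this predicate could be applied as `dataset.filter(keep_for_training)`.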
### Instance
```json
{
"repo_name":"...",
"path":".../BorderedButton.swift",
"copies":"2",
"size":"2649",
"content":"...",
"license":"mit",
"hash":"db1587fd117e9a835f58cf8203d8bf05",
"line_mean":29.1136363636,
"line_max":87,
"alpha_frac":0.6700641752,
"ratio":5.298,
"autogenerated":false,
"config_or_test":false,
"has_no_keywords":false,
"has_few_assignments":false
}
```
## Languages
The dataset contains only Swift files.
```json
{
"Swift": [".swift"]
}
```
## Licenses
Each entry in the dataset contains the associated license. The following is a list of licenses involved and their occurrences.
```json
{
"agpl-3.0":1415,
"apache-2.0":71451,
"artistic-2.0":169,
"bsd-2-clause":2628,
"bsd-3-clause":5492,
"cc0-1.0":1176,
"epl-1.0":498,
"gpl-2.0":7846,
"gpl-3.0":15716,
"isc":676,
"lgpl-2.1":932,
"lgpl-3.0":2553,
"mit":201134,
"mpl-2.0":6846,
"unlicense":1468
}
```
## Dataset Statistics
```json
{
"Total size": "~453 MB",
"Number of files": 320000,
"Number of files under 500 bytes": 3116,
  "Average file size in bytes": 5940
}
```
## Curation Process
See [the unsliced curated dataset](https://huggingface.co/datasets/mvasiliniuc/iva-swift-codeint-clean) for more details.
## Data Splits
The dataset contains only a train split. For the validation split and the unsliced versions, please check the following links:
* Clean Version Unsliced: https://huggingface.co/datasets/mvasiliniuc/iva-swift-codeint-clean
* Clean Version Valid: https://huggingface.co/datasets/mvasiliniuc/iva-swift-codeint-clean-valid
# Considerations for Using the Data
The dataset comprises source code from various repositories, potentially containing harmful or biased code,
along with sensitive information such as passwords or usernames.
|
Francesco/activity-diagrams-qdobr | ---
dataset_info:
features:
- name: image_id
dtype: int64
- name: image
dtype: image
- name: width
dtype: int32
- name: height
dtype: int32
- name: objects
sequence:
- name: id
dtype: int64
- name: area
dtype: int64
- name: bbox
sequence: float32
length: 4
- name: category
dtype:
class_label:
names:
'0': activity-diagrams
'1': action
'2': activity
'3': commeent
'4': control_flow
'5': control_flowcontrol_flow
'6': decision_node
'7': exit_node
'8': final_flow_node
'9': final_node
'10': fork
'11': merge
'12': merge_noode
'13': null
'14': object
'15': object_flow
'16': signal_recept
'17': signal_send
'18': start_node
'19': text
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- cc
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- object-detection
task_ids: []
pretty_name: activity-diagrams-qdobr
tags:
- rf100
---
# Dataset Card for activity-diagrams-qdobr
**The original COCO dataset is stored at `dataset.tar.gz`**
## Dataset Description
- **Homepage:** https://universe.roboflow.com/object-detection/activity-diagrams-qdobr
- **Point of Contact:** francesco.zuppichini@gmail.com
### Dataset Summary
activity-diagrams-qdobr
### Supported Tasks and Leaderboards
- `object-detection`: The dataset can be used to train a model for Object Detection.
### Languages
English
## Dataset Structure
### Data Instances
A data point comprises an image and its object annotations.
```
{
'image_id': 15,
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=640x640 at 0x2373B065C18>,
'width': 964043,
'height': 640,
'objects': {
'id': [114, 115, 116, 117],
'area': [3796, 1596, 152768, 81002],
'bbox': [
[302.0, 109.0, 73.0, 52.0],
[810.0, 100.0, 57.0, 28.0],
[160.0, 31.0, 248.0, 616.0],
[741.0, 68.0, 202.0, 401.0]
],
'category': [4, 4, 0, 0]
}
}
```
### Data Fields
- `image_id`: the image id
- `image`: `PIL.Image.Image` object containing the image. Note that when accessing the image column: `dataset[0]["image"]` the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`
- `width`: the image width
- `height`: the image height
- `objects`: a dictionary containing bounding box metadata for the objects present on the image
- `id`: the annotation id
- `area`: the area of the bounding box
- `bbox`: the object's bounding box (in the [coco](https://albumentations.ai/docs/getting_started/bounding_boxes_augmentation/#coco) format)
- `category`: the object's category.
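Since the `bbox` values use the COCO convention `[x_min, y_min, width, height]`, converting to corner coordinates is a one-liner. A small sketch in plain Python (the helper name is ours, not part of the dataset loader):

```python
def coco_to_corners(bbox):
    """Convert a COCO-format [x_min, y_min, width, height] box
    to [x_min, y_min, x_max, y_max] corner coordinates."""
    x, y, w, h = bbox
    return [x, y, x + w, y + h]

# First box from the data instance above
print(coco_to_corners([302.0, 109.0, 73.0, 52.0]))  # [302.0, 109.0, 375.0, 161.0]
```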
#### Who are the annotators?
Annotators are Roboflow users.
## Additional Information
### Licensing Information
See original homepage https://universe.roboflow.com/object-detection/activity-diagrams-qdobr
### Citation Information
```
@misc{ activity-diagrams-qdobr,
title = { activity diagrams qdobr Dataset },
type = { Open Source Dataset },
author = { Roboflow 100 },
howpublished = { \url{ https://universe.roboflow.com/object-detection/activity-diagrams-qdobr } },
url = { https://universe.roboflow.com/object-detection/activity-diagrams-qdobr },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2022 },
month = { nov },
note = { visited on 2023-03-29 },
}
```
### Contributions
Thanks to [@mariosasko](https://github.com/mariosasko) for adding this dataset. |
quocanh34/test_result_large_synthesis_data_ver2 | ---
dataset_info:
features:
- name: id
dtype: string
- name: w2v2_baseline_transcription
dtype: string
- name: w2v_baseline_norm
dtype: string
splits:
- name: train
num_bytes: 208073
num_examples: 1299
download_size: 109270
dataset_size: 208073
---
# Dataset Card for "test_result_large_synthesis_data_ver2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
UnderstandLing/oasst1_fr | ---
license: apache-2.0
dataset_info:
features:
- name: message_id
dtype: string
- name: parent_id
dtype: string
- name: user_id
dtype: string
- name: created_date
dtype: string
- name: text
dtype: string
- name: role
dtype: string
- name: lang
dtype: string
- name: review_count
dtype: int64
- name: review_result
dtype: bool
- name: deleted
dtype: bool
- name: rank
dtype: float64
- name: synthetic
dtype: bool
- name: model_name
dtype: 'null'
- name: detoxify
struct:
- name: identity_attack
dtype: float64
- name: insult
dtype: float64
- name: obscene
dtype: float64
- name: severe_toxicity
dtype: float64
- name: sexual_explicit
dtype: float64
- name: threat
dtype: float64
- name: toxicity
dtype: float64
- name: message_tree_id
dtype: string
- name: tree_state
dtype: string
- name: emojis
struct:
- name: count
sequence: int64
- name: name
sequence: string
- name: labels
struct:
- name: count
sequence: int64
- name: name
sequence: string
- name: value
sequence: float64
splits:
- name: train
num_bytes: 93639983
num_examples: 84424
- name: validation
num_bytes: 4907489
num_examples: 4400
download_size: 33824444
dataset_size: 98547472
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
Ahmed007/test001 | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 16610581.0
num_examples: 108
download_size: 15605780
dataset_size: 16610581.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "test001"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
intenzadigital/pablomarcal | ---
license: openrail
---
|
CyberHarem/elaina_majonotabitabi | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Elaina
This is the dataset of Elaina, containing 486 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 486 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 1107 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 1341 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 486 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 486 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 486 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 1107 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 1107 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 1011 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 1341 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 1341 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
SALT-NLP/FLUE-FiQA | ---
license: cc-by-3.0
---
## Dataset Summary
- **Homepage:** https://sites.google.com/view/salt-nlp-flang
- **Models:** https://huggingface.co/SALT-NLP/FLANG-BERT
- **Repository:** https://github.com/SALT-NLP/FLANG
## FLUE
FLUE (Financial Language Understanding Evaluation) is a comprehensive and heterogeneous benchmark built from five diverse financial domain-specific datasets.
Sentiment Classification: [Financial PhraseBank](https://huggingface.co/datasets/financial_phrasebank)\
Sentiment Analysis, Question Answering: [FiQA 2018](https://huggingface.co/datasets/SALT-NLP/FLUE-FiQA)\
News Headlines Classification: [Headlines](https://www.kaggle.com/datasets/daittan/gold-commodity-news-and-dimensions)\
Named Entity Recognition: [NER](https://huggingface.co/datasets/SALT-NLP/FLUE-NER)\
Structure Boundary Detection: [FinSBD3](https://sites.google.com/nlg.csie.ntu.edu.tw/finweb2021/shared-task-finsbd-3)
## Dataset Structure
The FiQA dataset has a corpus, queries and qrels (relevance judgments file). They are in the following format:
- `corpus` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with three fields `_id` with unique document identifier, `title` with document title (optional) and `text` with document paragraph or passage. For example: `{"_id": "doc1", "title": "Albert Einstein", "text": "Albert Einstein was a German-born...."}`
- `queries` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with two fields `_id` with unique query identifier and `text` with query text. For example: `{"_id": "q1", "text": "Who developed the mass-energy equivalence formula?"}`
- `qrels` file: a `.tsv` file (tab-separated) that contains three columns, i.e. the `query-id`, `corpus-id` and `score`, in this order. The first row is a header. For example: `q1 doc1 1`
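The three file formats above can be parsed with the standard library alone. A minimal sketch reading them from in-memory strings (the helper names are ours, not part of the dataset):

```python
import csv
import io
import json

def load_jsonl(text):
    """Parse a .jsonl corpus/queries file into a dict keyed by _id."""
    return {d["_id"]: d
            for d in (json.loads(line) for line in text.splitlines() if line.strip())}

def load_qrels(tsv_text):
    """Parse a qrels TSV (query-id, corpus-id, score; first row is a header)."""
    reader = csv.reader(io.StringIO(tsv_text), delimiter="\t")
    next(reader)  # skip the header row
    qrels = {}
    for query_id, corpus_id, score in reader:
        qrels.setdefault(query_id, {})[corpus_id] = int(score)
    return qrels

corpus = load_jsonl('{"_id": "doc1", "title": "Albert Einstein", "text": "Albert Einstein was a German-born...."}')
qrels = load_qrels("query-id\tcorpus-id\tscore\nq1\tdoc1\t1\n")
print(qrels)  # {'q1': {'doc1': 1}}
```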
|
Iker/InstructTranslation-EN-ES-Raw | ---
language:
- en
- es
license: apache-2.0
size_categories:
- 1K<n<10K
task_categories:
- translation
- text-generation
- text2text-generation
dataset_info:
features:
- name: example_no
dtype: int64
- name: conversation_no
dtype: int64
- name: from
dtype: string
- name: conversation
dtype: string
- name: translation
dtype: string
splits:
- name: train
num_bytes: 1942006
num_examples: 1459
download_size: 1038931
dataset_size: 1942006
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Translation of Instructions EN-ES
This dataset contains prompts and answers from [teknium/OpenHermes-2.5](https://huggingface.co/datasets/teknium/OpenHermes-2.5) translated to Spanish using GPT-4-0125-preview. The dataset is intended for training a model to translate instructions from English to Spanish.
The dataset contains a `conversation` field with the English instruction/answer and a `translation` field with the translated text. `example_no` and `conversation_no` correspond to the original example id and conversation number in OpenHermes-2.5. Prompts are labeled `human` and answers are labeled `gpt`.
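Given that schema, rows sharing the same `example_no` and `conversation_no` can be regrouped into parallel English/Spanish pairs. A minimal sketch in plain Python (the grouping helper and field usage follow the schema above; the function name is ours):

```python
from collections import defaultdict

def to_parallel_pairs(rows):
    """Group rows into (english, spanish) pairs keyed by
    (example_no, conversation_no), preserving row order."""
    pairs = defaultdict(list)
    for row in rows:
        key = (row["example_no"], row["conversation_no"])
        pairs[key].append((row["conversation"], row["translation"]))
    return dict(pairs)

rows = [
    {"example_no": 1, "conversation_no": 0, "from": "human",
     "conversation": "Translate this.", "translation": "Traduce esto."},
]
print(to_parallel_pairs(rows))
```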
A dataset already formatted for instruction tuning is available here: https://huggingface.co/datasets/Iker/InstructTranslation-EN-ES-Raw |
deepapaikar/Llama_SC_pairs | ---
license: apache-2.0
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1976153
num_examples: 5346
download_size: 858001
dataset_size: 1976153
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mozilla-foundation/common_voice_12_0 | ---
pretty_name: Common Voice Corpus 12.0
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- ab
- ar
- as
- ast
- az
- ba
- bas
- be
- bg
- bn
- br
- ca
- ckb
- cnh
- cs
- cv
- cy
- da
- de
- dv
- el
- en
- eo
- es
- et
- eu
- fa
- fi
- fr
- gl
- gn
- ha
- hi
- hsb
- hu
- ia
- id
- ig
- it
- ja
- ka
- kab
- kk
- kmr
- ko
- ky
- lg
- lt
- lv
- mdf
- mhr
- mk
- ml
- mn
- mr
- mrj
- mt
- myv
- nl
- oc
- or
- pl
- pt
- quy
- ro
- ru
- rw
- sah
- sat
- sc
- sk
- skr
- sl
- sr
- sw
- ta
- th
- ti
- tig
- tok
- tr
- tt
- tw
- ug
- uk
- ur
- uz
- vi
- vot
- yo
- yue
- rm
- zh
- sv
- pa
- nn
- ne
- nan
- hy
- ga
- fy
language_bcp47:
- fy-NL
- ga-IE
- hy-AM
- nan-tw
- ne-NP
- nn-NO
- pa-IN
- rm-sursilv
- rm-vallader
- sv-SE
- zh-CN
- zh-HK
- zh-TW
license:
- cc0-1.0
multilinguality:
- multilingual
size_categories:
- 10M<n<100M
source_datasets:
- extended|common_voice
task_categories:
- automatic-speech-recognition
paperswithcode_id: common-voice
extra_gated_prompt: "By clicking on “Access repository” below, you also agree to not attempt to determine the identity of speakers in the Common Voice dataset."
---
# Dataset Card for Common Voice Corpus 12.0
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://commonvoice.mozilla.org/en/datasets
- **Repository:** https://github.com/common-voice/common-voice
- **Paper:** https://arxiv.org/abs/1912.06670
- **Leaderboard:** https://paperswithcode.com/dataset/common-voice
- **Point of Contact:** [Vaibhav Srivastav](mailto:vaibhav@huggingface.co)
### Dataset Summary
The Common Voice dataset consists of a unique MP3 and corresponding text file.
Many of the 26119 recorded hours in the dataset also include demographic metadata like age, sex, and accent
that can help improve the accuracy of speech recognition engines.
The dataset currently consists of 17127 validated hours in 104 languages, but more voices and languages are always being added.
Take a look at the [Languages](https://commonvoice.mozilla.org/en/languages) page to request a language or start contributing.
### Supported Tasks and Leaderboards
The results for models trained on the Common Voice datasets are available via the
[🤗 Autoevaluate Leaderboard](https://huggingface.co/spaces/autoevaluate/leaderboards?dataset=mozilla-foundation%2Fcommon_voice_11_0&only_verified=0&task=automatic-speech-recognition&config=ar&split=test&metric=wer)
### Languages
```
Abkhaz, Arabic, Armenian, Assamese, Asturian, Azerbaijani, Basaa, Bashkir, Basque, Belarusian, Bengali, Breton, Bulgarian, Cantonese, Catalan, Central Kurdish, Chinese (China), Chinese (Hong Kong), Chinese (Taiwan), Chuvash, Czech, Danish, Dhivehi, Dutch, English, Erzya, Esperanto, Estonian, Finnish, French, Frisian, Galician, Georgian, German, Greek, Guarani, Hakha Chin, Hausa, Hill Mari, Hindi, Hungarian, Igbo, Indonesian, Interlingua, Irish, Italian, Japanese, Kabyle, Kazakh, Kinyarwanda, Korean, Kurmanji Kurdish, Kyrgyz, Latvian, Lithuanian, Luganda, Macedonian, Malayalam, Maltese, Marathi, Meadow Mari, Moksha, Mongolian, Nepali, Norwegian Nynorsk, Occitan, Odia, Persian, Polish, Portuguese, Punjabi, Quechua Chanka, Romanian, Romansh Sursilvan, Romansh Vallader, Russian, Sakha, Santali (Ol Chiki), Saraiki, Sardinian, Serbian, Slovak, Slovenian, Sorbian, Upper, Spanish, Swahili, Swedish, Taiwanese (Minnan), Tamil, Tatar, Thai, Tigre, Tigrinya, Toki Pona, Turkish, Twi, Ukrainian, Urdu, Uyghur, Uzbek, Vietnamese, Votic, Welsh, Yoruba
```
## How to use
The `datasets` library allows you to load and pre-process your dataset in pure Python, at scale. The dataset can be downloaded and prepared in one call to your local drive by using the `load_dataset` function.
For example, to download the Hindi config, simply specify the corresponding language config name (i.e., "hi" for Hindi):
```python
from datasets import load_dataset
cv_12 = load_dataset("mozilla-foundation/common_voice_12_0", "hi", split="train")
```
Using the datasets library, you can also stream the dataset on-the-fly by adding a `streaming=True` argument to the `load_dataset` function call. Loading a dataset in streaming mode loads individual samples of the dataset at a time, rather than downloading the entire dataset to disk.
```python
from datasets import load_dataset
cv_12 = load_dataset("mozilla-foundation/common_voice_12_0", "hi", split="train", streaming=True)
print(next(iter(cv_12)))
```
*Bonus*: create a [PyTorch dataloader](https://huggingface.co/docs/datasets/use_with_pytorch) directly with your own datasets (local/streamed).
### Local
```python
from datasets import load_dataset
from torch.utils.data import DataLoader
from torch.utils.data.sampler import BatchSampler, RandomSampler
cv_12 = load_dataset("mozilla-foundation/common_voice_12_0", "hi", split="train")
batch_sampler = BatchSampler(RandomSampler(cv_12), batch_size=32, drop_last=False)
dataloader = DataLoader(cv_12, batch_sampler=batch_sampler)
```
### Streaming
```python
from datasets import load_dataset
from torch.utils.data import DataLoader
cv_12 = load_dataset("mozilla-foundation/common_voice_12_0", "hi", split="train")
dataloader = DataLoader(cv_12, batch_size=32)
```
To find out more about loading and preparing audio datasets, head over to [hf.co/blog/audio-datasets](https://huggingface.co/blog/audio-datasets).
### Example scripts
Train your own CTC or Seq2Seq Automatic Speech Recognition models on Common Voice 12 with `transformers` - [here](https://github.com/huggingface/transformers/tree/main/examples/pytorch/speech-recognition).
## Dataset Structure
### Data Instances
A typical data point comprises the `path` to the audio file and its `sentence`.
Additional fields include `accent`, `age`, `client_id`, `up_votes`, `down_votes`, `gender`, `locale` and `segment`.
```python
{
'client_id': 'd59478fbc1ee646a28a3c652a119379939123784d99131b865a89f8b21c81f69276c48bd574b81267d9d1a77b83b43e6d475a6cfc79c232ddbca946ae9c7afc5',
'path': 'et/clips/common_voice_et_18318995.mp3',
'audio': {
'path': 'et/clips/common_voice_et_18318995.mp3',
'array': array([-0.00048828, -0.00018311, -0.00137329, ..., 0.00079346, 0.00091553, 0.00085449], dtype=float32),
'sampling_rate': 48000
},
'sentence': 'Tasub kokku saada inimestega, keda tunned juba ammust ajast saati.',
'up_votes': 2,
'down_votes': 0,
'age': 'twenties',
'gender': 'male',
'accent': '',
'locale': 'et',
'segment': ''
}
```
### Data Fields
`client_id` (`string`): An id for which client (voice) made the recording
`path` (`string`): The path to the audio file
`audio` (`dict`): A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column: `dataset[0]["audio"]` the audio file is automatically decoded and resampled to `dataset.features["audio"].sampling_rate`. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the `"audio"` column, *i.e.* `dataset[0]["audio"]` should **always** be preferred over `dataset["audio"][0]`.
`sentence` (`string`): The sentence the user was prompted to speak
`up_votes` (`int64`): How many upvotes the audio file has received from reviewers
`down_votes` (`int64`): How many downvotes the audio file has received from reviewers
`age` (`string`): The age of the speaker (e.g. `teens`, `twenties`, `fifties`)
`gender` (`string`): The gender of the speaker
`accent` (`string`): Accent of the speaker
`locale` (`string`): The locale of the speaker
`segment` (`string`): Usually an empty field
### Data Splits
The speech material has been subdivided into portions for dev, train, test, validated, invalidated, reported and other.
The validated data has been reviewed and received upvotes confirming that it is of high quality.
The invalidated data has been reviewed and received downvotes indicating that it is of low quality.
The reported data has been reported by users, for various reasons.
The other data has not yet been reviewed.
The dev, test and train splits all contain data that has been reviewed and deemed of high quality.
## Data Preprocessing Recommended by Hugging Face
The following are data preprocessing steps advised by the Hugging Face team. They are accompanied by an example code snippet that shows how to put them to practice.
Many examples in this dataset have trailing quotation marks, e.g. _“the cat sat on the mat.“_. These trailing quotation marks do not change the actual meaning of the sentence, and it is near impossible to infer from audio data alone whether a sentence is a quotation or not. In these cases, it is advised to strip the quotation marks, leaving: _the cat sat on the mat_.
In addition, the majority of training sentences end in punctuation ( . or ? or ! ), whereas just a small proportion do not. In the dev set, **almost all** sentences end in punctuation. Thus, it is recommended to append a full-stop ( . ) to the end of the small number of training examples that do not end in punctuation.
```python
from datasets import load_dataset
ds = load_dataset("mozilla-foundation/common_voice_12_0", "en", use_auth_token=True)
def prepare_dataset(batch):
"""Function to preprocess the dataset with the .map method"""
transcription = batch["sentence"]
if transcription.startswith('"') and transcription.endswith('"'):
# we can remove trailing quotation marks as they do not affect the transcription
transcription = transcription[1:-1]
if transcription[-1] not in [".", "?", "!"]:
# append a full-stop to sentences that do not end in punctuation
transcription = transcription + "."
batch["sentence"] = transcription
return batch
ds = ds.map(prepare_dataset, desc="preprocess dataset")
```
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
The dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in the Common Voice dataset.
## Considerations for Using the Data
### Social Impact of Dataset
The dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in the Common Voice dataset.
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Public Domain, [CC-0](https://creativecommons.org/share-your-work/public-domain/cc0/)
### Citation Information
```
@inproceedings{commonvoice:2020,
author = {Ardila, R. and Branson, M. and Davis, K. and Henretty, M. and Kohler, M. and Meyer, J. and Morais, R. and Saunders, L. and Tyers, F. M. and Weber, G.},
title = {Common Voice: A Massively-Multilingual Speech Corpus},
booktitle = {Proceedings of the 12th Conference on Language Resources and Evaluation (LREC 2020)},
pages = {4211--4215},
year = 2020
}
```
|
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_197 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 897006720.0
num_examples: 176160
download_size: 916279975
dataset_size: 897006720.0
---
# Dataset Card for "chunk_197"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yangezheng/SWSR-SexComment | ---
dataset_info:
features:
- name: text_cn
dtype: string
- name: label_sexist
dtype:
class_label:
names:
'0': not sexist
'1': sexist
- name: category
dtype: string
- name: target
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 2486780.276163808
num_examples: 6941
- name: validation
num_bytes: 276587.5771788589
num_examples: 772
- name: test
num_bytes: 307399.1466573329
num_examples: 858
download_size: 2021156
dataset_size: 3070767.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
CyberHarem/koganeikoyuzu_edomaeelf | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Koganei Koyuzu (こがねいこゆず)
This is the dataset of Koganei Koyuzu, containing 80 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 80 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 182 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 80 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 80 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 80 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 80 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 80 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 182 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 182 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 182 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
airedefined/dolly-rag-gpt-3.5-turbo | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: category
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 7149817.225654802
num_examples: 4367
- name: test
num_bytes: 163723.77434519812
num_examples: 100
download_size: 4485701
dataset_size: 7313541.0
---
# Dataset Card for "dolly-rag-gpt-3.5-turbo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
annyorange/colorized-dataset | ---
dataset_info:
features:
- name: original_image
dtype: image
- name: edit_prompt
dtype: string
- name: colorized_image
dtype: image
splits:
- name: train
num_bytes: 32465878.0
num_examples: 711
download_size: 32520629
dataset_size: 32465878.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
manu/french_5p | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
- name: dataset_id
dtype: string
splits:
- name: train
num_bytes: 28720999458
num_examples: 44392012
- name: test
num_bytes: 50741966
num_examples: 4035
download_size: 15225609944
dataset_size: 28771741424
---
# Dataset Card for "french_5p"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
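For a rough sense of scale from the metadata above: the compressed download is about half the loaded dataset size, and train documents average well under 1 KB each. A small sketch, using only the figures stated in the card:

```python
# Figures copied from the card's metadata above.
download_size = 15_225_609_944   # compressed Parquet on the Hub
dataset_size = 28_771_741_424    # total bytes once loaded
train_bytes, train_examples = 28_720_999_458, 44_392_012

ratio = download_size / dataset_size          # ~0.53
avg_doc = train_bytes / train_examples        # ~647 bytes per document
print(f"compression ratio ~{ratio:.2f}, avg train doc ~{avg_doc:.0f} bytes")
```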
CyberHarem/matou_sakura_fatestaynightufotable | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Matou Sakura (Fate Stay Night [UFOTABLE])
This is the dataset of Matou Sakura (Fate Stay Night [UFOTABLE]), containing 163 images and their tags.
The core tags of this character are `purple_hair, long_hair, ribbon, hair_ribbon, purple_eyes, red_ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 163 | 152.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matou_sakura_fatestaynightufotable/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 163 | 152.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matou_sakura_fatestaynightufotable/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 331 | 274.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matou_sakura_fatestaynightufotable/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/matou_sakura_fatestaynightufotable',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, homurahara_academy_school_uniform, solo, pink_apron |
| 1 | 7 |  |  |  |  |  | 1girl, solo, homurahara_academy_school_uniform, jacket, looking_at_viewer, empty_eyes, upper_body |
| 2 | 6 |  |  |  |  |  | 1girl, brown_vest, homurahara_academy_school_uniform, neck_ribbon, solo, white_shirt, collared_shirt, long_sleeves, rain, upper_body, wet_clothes, wet_hair |
| 3 | 8 |  |  |  |  |  | 1girl, anime_coloring, portrait, solo, looking_at_viewer, empty_eyes, indoors, blurry, open_mouth, parody |
| 4 | 7 |  |  |  |  |  | 1girl, solo, bowl, cardigan, food, official_alternate_costume, dress, upper_body, holding_chopsticks, long_sleeves |
| 5 | 6 |  |  |  |  |  | 1girl, collarbone, official_alternate_costume, solo, white_dress, elbow_gloves, necklace, open_mouth, puffy_short_sleeves, white_gloves, cleavage, upper_body, medium_breasts, outdoors, pink_dress, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | homurahara_academy_school_uniform | solo | pink_apron | jacket | looking_at_viewer | empty_eyes | upper_body | brown_vest | neck_ribbon | white_shirt | collared_shirt | long_sleeves | rain | wet_clothes | wet_hair | anime_coloring | portrait | indoors | blurry | open_mouth | parody | bowl | cardigan | food | official_alternate_costume | dress | holding_chopsticks | collarbone | white_dress | elbow_gloves | necklace | puffy_short_sleeves | white_gloves | cleavage | medium_breasts | outdoors | pink_dress | smile |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------------------------------------|:-------|:-------------|:---------|:--------------------|:-------------|:-------------|:-------------|:--------------|:--------------|:-----------------|:---------------|:-------|:--------------|:-----------|:-----------------|:-----------|:----------|:---------|:-------------|:---------|:-------|:-----------|:-------|:-----------------------------|:--------|:---------------------|:-------------|:--------------|:---------------|:-----------|:----------------------|:---------------|:-----------|:-----------------|:-----------|:-------------|:--------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | | X | | | X | X | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | | X | | | | | X | | | | | X | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | | X | | | | | X | | | | | | | | | | | | | X | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X |
|
jborden13/kb-jsonl | ---
dataset_info:
features:
- name: productid
dtype: string
- name: title
dtype: string
- name: url
dtype: string
- name: add_to_cart_url
dtype: string
- name: categories
dtype: string
- name: retailer
dtype: string
- name: total_cannabinoids
dtype: string
- name: total_terpenes
dtype: string
- name: total_cbd
dtype: string
- name: total_thc
dtype: string
- name: lab_results
dtype: string
- name: prices
dtype: string
- name: description
dtype: string
- name: effects
dtype: string
- name: added
dtype: string
splits:
- name: train
num_bytes: 16657
num_examples: 13
download_size: 22616
dataset_size: 16657
---
# Dataset Card for "kb-jsonl"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
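Every feature above, including structured-looking fields such as `prices`, `lab_results`, and `effects`, is declared with dtype `string`. If those fields hold serialized JSON (an assumption; the card does not say how they are encoded), a hypothetical row could be decoded like this:

```python
import json

# Hypothetical row: the field names match the card's schema, but the
# values and the JSON-in-a-string encoding are illustrative assumptions.
row = {
    "productid": "abc123",
    "title": "Example product",
    "prices": '[{"quantity": "1g", "price": 25.0}]',
}

prices = json.loads(row["prices"])  # parse the stringified field
print(prices[0]["price"])
```

If the fields turn out to use a different serialization, only the `json.loads` call would need to change.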
open-llm-leaderboard/details_team-lucid__mptk-1b | ---
pretty_name: Evaluation run of team-lucid/mptk-1b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [team-lucid/mptk-1b](https://huggingface.co/team-lucid/mptk-1b) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 5 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_team-lucid__mptk-1b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-05T14:56:57.175080](https://huggingface.co/datasets/open-llm-leaderboard/details_team-lucid__mptk-1b/blob/main/results_2024-03-05T14-56-57.175080.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2688031819287981,\n\
\ \"acc_stderr\": 0.031342368045241306,\n \"acc_norm\": 0.270436370472431,\n\
\ \"acc_norm_stderr\": 0.03213806602544014,\n \"mc1\": 0.21909424724602203,\n\
\ \"mc1_stderr\": 0.014480038578757447,\n \"mc2\": 0.39709801636779535,\n\
\ \"mc2_stderr\": 0.014808433008259212\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.20648464163822525,\n \"acc_stderr\": 0.011828865619002316,\n\
\ \"acc_norm\": 0.24061433447098976,\n \"acc_norm_stderr\": 0.012491468532390566\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.31248755228042224,\n\
\ \"acc_stderr\": 0.004625600916774991,\n \"acc_norm\": 0.35610436168094006,\n\
\ \"acc_norm_stderr\": 0.004778679507786489\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2074074074074074,\n\
\ \"acc_stderr\": 0.035025531706783145,\n \"acc_norm\": 0.2074074074074074,\n\
\ \"acc_norm_stderr\": 0.035025531706783145\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.03459777606810537,\n\
\ \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.03459777606810537\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.18,\n\
\ \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.18,\n \
\ \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2792452830188679,\n \"acc_stderr\": 0.02761116340239972,\n\
\ \"acc_norm\": 0.2792452830188679,\n \"acc_norm_stderr\": 0.02761116340239972\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.31213872832369943,\n\
\ \"acc_stderr\": 0.035331333893236574,\n \"acc_norm\": 0.31213872832369943,\n\
\ \"acc_norm_stderr\": 0.035331333893236574\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006717,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006717\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\": 0.22,\n\
\ \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.027678452578212387,\n\
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.027678452578212387\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n\
\ \"acc_stderr\": 0.03892431106518754,\n \"acc_norm\": 0.21929824561403508,\n\
\ \"acc_norm_stderr\": 0.03892431106518754\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.03333333333333329,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.03333333333333329\n },\n\
\ \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.02241804289111394,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.02241804289111394\n },\n \"harness|hendrycksTest-formal_logic|5\"\
: {\n \"acc\": 0.36507936507936506,\n \"acc_stderr\": 0.04306241259127153,\n\
\ \"acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.04306241259127153\n\
\ },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\"\
: {\n \"acc\": 0.23870967741935484,\n \"acc_stderr\": 0.024251071262208837,\n\
\ \"acc_norm\": 0.23870967741935484,\n \"acc_norm_stderr\": 0.024251071262208837\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.270935960591133,\n \"acc_stderr\": 0.03127090713297697,\n \"acc_norm\"\
: 0.270935960591133,\n \"acc_norm_stderr\": 0.03127090713297697\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n\
\ \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21212121212121213,\n \"acc_stderr\": 0.03192271569548299,\n\
\ \"acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.03192271569548299\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.29797979797979796,\n \"acc_stderr\": 0.03258630383836556,\n \"\
acc_norm\": 0.29797979797979796,\n \"acc_norm_stderr\": 0.03258630383836556\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.3626943005181347,\n \"acc_stderr\": 0.034697137917043715,\n\
\ \"acc_norm\": 0.3626943005181347,\n \"acc_norm_stderr\": 0.034697137917043715\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.358974358974359,\n \"acc_stderr\": 0.024321738484602354,\n \
\ \"acc_norm\": 0.358974358974359,\n \"acc_norm_stderr\": 0.024321738484602354\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712166,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712166\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.31932773109243695,\n \"acc_stderr\": 0.0302839955258844,\n \
\ \"acc_norm\": 0.31932773109243695,\n \"acc_norm_stderr\": 0.0302839955258844\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119995,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119995\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3302752293577982,\n \"acc_stderr\": 0.02016446633634298,\n \"\
acc_norm\": 0.3302752293577982,\n \"acc_norm_stderr\": 0.02016446633634298\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.32407407407407407,\n \"acc_stderr\": 0.03191923445686186,\n \"\
acc_norm\": 0.32407407407407407,\n \"acc_norm_stderr\": 0.03191923445686186\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2696078431372549,\n \"acc_stderr\": 0.031145570659486782,\n \"\
acc_norm\": 0.2696078431372549,\n \"acc_norm_stderr\": 0.031145570659486782\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.24050632911392406,\n \"acc_stderr\": 0.027820781981149675,\n \
\ \"acc_norm\": 0.24050632911392406,\n \"acc_norm_stderr\": 0.027820781981149675\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.13004484304932734,\n\
\ \"acc_stderr\": 0.022574519424174877,\n \"acc_norm\": 0.13004484304932734,\n\
\ \"acc_norm_stderr\": 0.022574519424174877\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.26717557251908397,\n \"acc_stderr\": 0.03880848301082394,\n\
\ \"acc_norm\": 0.26717557251908397,\n \"acc_norm_stderr\": 0.03880848301082394\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.1901840490797546,\n \"acc_stderr\": 0.030833491146281245,\n\
\ \"acc_norm\": 0.1901840490797546,\n \"acc_norm_stderr\": 0.030833491146281245\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.03894641120044792,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.03894641120044792\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.36893203883495146,\n \"acc_stderr\": 0.047776151811567386,\n\
\ \"acc_norm\": 0.36893203883495146,\n \"acc_norm_stderr\": 0.047776151811567386\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.25213675213675213,\n\
\ \"acc_stderr\": 0.02844796547623101,\n \"acc_norm\": 0.25213675213675213,\n\
\ \"acc_norm_stderr\": 0.02844796547623101\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.1979565772669221,\n\
\ \"acc_stderr\": 0.01424887354921758,\n \"acc_norm\": 0.1979565772669221,\n\
\ \"acc_norm_stderr\": 0.01424887354921758\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27150837988826815,\n\
\ \"acc_stderr\": 0.01487425216809527,\n \"acc_norm\": 0.27150837988826815,\n\
\ \"acc_norm_stderr\": 0.01487425216809527\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2679738562091503,\n \"acc_stderr\": 0.025360603796242557,\n\
\ \"acc_norm\": 0.2679738562091503,\n \"acc_norm_stderr\": 0.025360603796242557\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24758842443729903,\n\
\ \"acc_stderr\": 0.02451387997362197,\n \"acc_norm\": 0.24758842443729903,\n\
\ \"acc_norm_stderr\": 0.02451387997362197\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.24691358024691357,\n \"acc_stderr\": 0.023993501709042103,\n\
\ \"acc_norm\": 0.24691358024691357,\n \"acc_norm_stderr\": 0.023993501709042103\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24822695035460993,\n \"acc_stderr\": 0.025770015644290382,\n \
\ \"acc_norm\": 0.24822695035460993,\n \"acc_norm_stderr\": 0.025770015644290382\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2405475880052151,\n\
\ \"acc_stderr\": 0.010916406735478949,\n \"acc_norm\": 0.2405475880052151,\n\
\ \"acc_norm_stderr\": 0.010916406735478949\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.030134614954403924,\n \
\ \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.030134614954403924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25326797385620914,\n \"acc_stderr\": 0.01759348689536683,\n \
\ \"acc_norm\": 0.25326797385620914,\n \"acc_norm_stderr\": 0.01759348689536683\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2727272727272727,\n\
\ \"acc_stderr\": 0.04265792110940588,\n \"acc_norm\": 0.2727272727272727,\n\
\ \"acc_norm_stderr\": 0.04265792110940588\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.27346938775510204,\n \"acc_stderr\": 0.02853556033712844,\n\
\ \"acc_norm\": 0.27346938775510204,\n \"acc_norm_stderr\": 0.02853556033712844\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2736318407960199,\n\
\ \"acc_stderr\": 0.03152439186555402,\n \"acc_norm\": 0.2736318407960199,\n\
\ \"acc_norm_stderr\": 0.03152439186555402\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.29518072289156627,\n\
\ \"acc_stderr\": 0.03550920185689629,\n \"acc_norm\": 0.29518072289156627,\n\
\ \"acc_norm_stderr\": 0.03550920185689629\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.0330140594698725,\n\
\ \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.0330140594698725\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.21909424724602203,\n\
\ \"mc1_stderr\": 0.014480038578757447,\n \"mc2\": 0.39709801636779535,\n\
\ \"mc2_stderr\": 0.014808433008259212\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.510655090765588,\n \"acc_stderr\": 0.0140492945362904\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.008339651250947688,\n \
\ \"acc_stderr\": 0.0025049422268605143\n }\n}\n```"
repo_url: https://huggingface.co/team-lucid/mptk-1b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|arc:challenge|25_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|arc:challenge|25_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|arc:challenge|25_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T09_56_21.993036
path:
- '**/details_harness|drop|3_2023-10-24T09-56-21.993036.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T09-56-21.993036.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T09_56_21.993036
path:
- '**/details_harness|gsm8k|5_2023-10-24T09-56-21.993036.parquet'
- split: 2023_12_03T14_33_48.115586
path:
- '**/details_harness|gsm8k|5_2023-12-03T14-33-48.115586.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|gsm8k|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|gsm8k|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hellaswag|10_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hellaswag|10_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hellaswag|10_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T22-36-48.355605.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-05T11-37-11.737791.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-05T14-56-57.175080.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-14T22-36-48.355605.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-05T14-56-57.175080.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T09_56_21.993036
path:
- '**/details_harness|winogrande|5_2023-10-24T09-56-21.993036.parquet'
- split: 2024_03_05T11_37_11.737791
path:
- '**/details_harness|winogrande|5_2024-03-05T11-37-11.737791.parquet'
- split: 2024_03_05T14_56_57.175080
path:
- '**/details_harness|winogrande|5_2024-03-05T14-56-57.175080.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-05T14-56-57.175080.parquet'
- config_name: results
data_files:
- split: 2023_09_14T22_36_48.355605
path:
- results_2023-09-14T22-36-48.355605.parquet
- split: 2023_10_24T09_56_21.993036
path:
- results_2023-10-24T09-56-21.993036.parquet
- split: 2023_12_03T14_33_48.115586
path:
- results_2023-12-03T14-33-48.115586.parquet
- split: 2024_03_05T11_37_11.737791
path:
- results_2024-03-05T11-37-11.737791.parquet
- split: 2024_03_05T14_56_57.175080
path:
- results_2024-03-05T14-56-57.175080.parquet
- split: latest
path:
- results_2024-03-05T14-56-57.175080.parquet
---
# Dataset Card for Evaluation run of team-lucid/mptk-1b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [team-lucid/mptk-1b](https://huggingface.co/team-lucid/mptk-1b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 5 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
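The split names in the configs above follow a simple convention that can be seen by comparing a run timestamp to its split name: the `-` and `:` characters of the ISO timestamp are replaced with `_`. A minimal sketch of that mapping (the timestamp is taken from this card's own latest run):

```python
# Sketch of the split-naming convention visible in the configs above:
# the run timestamp "2024-03-05T14:56:57.175080" becomes the split
# "2024_03_05T14_56_57.175080" by replacing ":" and "-" with "_".
timestamp = "2024-03-05T14:56:57.175080"
split_name = timestamp.replace(":", "-").replace("-", "_")
print(split_name)  # 2024_03_05T14_56_57.175080
```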
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_team-lucid__mptk-1b",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-05T14:56:57.175080](https://huggingface.co/datasets/open-llm-leaderboard/details_team-lucid__mptk-1b/blob/main/results_2024-03-05T14-56-57.175080.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2688031819287981,
"acc_stderr": 0.031342368045241306,
"acc_norm": 0.270436370472431,
"acc_norm_stderr": 0.03213806602544014,
"mc1": 0.21909424724602203,
"mc1_stderr": 0.014480038578757447,
"mc2": 0.39709801636779535,
"mc2_stderr": 0.014808433008259212
},
"harness|arc:challenge|25": {
"acc": 0.20648464163822525,
"acc_stderr": 0.011828865619002316,
"acc_norm": 0.24061433447098976,
"acc_norm_stderr": 0.012491468532390566
},
"harness|hellaswag|10": {
"acc": 0.31248755228042224,
"acc_stderr": 0.004625600916774991,
"acc_norm": 0.35610436168094006,
"acc_norm_stderr": 0.004778679507786489
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2074074074074074,
"acc_stderr": 0.035025531706783145,
"acc_norm": 0.2074074074074074,
"acc_norm_stderr": 0.035025531706783145
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03459777606810537,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03459777606810537
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2792452830188679,
"acc_stderr": 0.02761116340239972,
"acc_norm": 0.2792452830188679,
"acc_norm_stderr": 0.02761116340239972
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.31213872832369943,
"acc_stderr": 0.035331333893236574,
"acc_norm": 0.31213872832369943,
"acc_norm_stderr": 0.035331333893236574
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006717,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006717
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.027678452578212387,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.027678452578212387
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21929824561403508,
"acc_stderr": 0.03892431106518754,
"acc_norm": 0.21929824561403508,
"acc_norm_stderr": 0.03892431106518754
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2,
"acc_stderr": 0.03333333333333329,
"acc_norm": 0.2,
"acc_norm_stderr": 0.03333333333333329
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.02241804289111394,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.02241804289111394
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.23870967741935484,
"acc_stderr": 0.024251071262208837,
"acc_norm": 0.23870967741935484,
"acc_norm_stderr": 0.024251071262208837
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.270935960591133,
"acc_stderr": 0.03127090713297697,
"acc_norm": 0.270935960591133,
"acc_norm_stderr": 0.03127090713297697
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.03192271569548299,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.03192271569548299
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.29797979797979796,
"acc_stderr": 0.03258630383836556,
"acc_norm": 0.29797979797979796,
"acc_norm_stderr": 0.03258630383836556
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3626943005181347,
"acc_stderr": 0.034697137917043715,
"acc_norm": 0.3626943005181347,
"acc_norm_stderr": 0.034697137917043715
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.358974358974359,
"acc_stderr": 0.024321738484602354,
"acc_norm": 0.358974358974359,
"acc_norm_stderr": 0.024321738484602354
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712166,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712166
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.31932773109243695,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.31932773109243695,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119995,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119995
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3302752293577982,
"acc_stderr": 0.02016446633634298,
"acc_norm": 0.3302752293577982,
"acc_norm_stderr": 0.02016446633634298
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.32407407407407407,
"acc_stderr": 0.03191923445686186,
"acc_norm": 0.32407407407407407,
"acc_norm_stderr": 0.03191923445686186
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2696078431372549,
"acc_stderr": 0.031145570659486782,
"acc_norm": 0.2696078431372549,
"acc_norm_stderr": 0.031145570659486782
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.24050632911392406,
"acc_stderr": 0.027820781981149675,
"acc_norm": 0.24050632911392406,
"acc_norm_stderr": 0.027820781981149675
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.13004484304932734,
"acc_stderr": 0.022574519424174877,
"acc_norm": 0.13004484304932734,
"acc_norm_stderr": 0.022574519424174877
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.26717557251908397,
"acc_stderr": 0.03880848301082394,
"acc_norm": 0.26717557251908397,
"acc_norm_stderr": 0.03880848301082394
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.1901840490797546,
"acc_stderr": 0.030833491146281245,
"acc_norm": 0.1901840490797546,
"acc_norm_stderr": 0.030833491146281245
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03894641120044792,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03894641120044792
},
"harness|hendrycksTest-management|5": {
"acc": 0.36893203883495146,
"acc_stderr": 0.047776151811567386,
"acc_norm": 0.36893203883495146,
"acc_norm_stderr": 0.047776151811567386
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.25213675213675213,
"acc_stderr": 0.02844796547623101,
"acc_norm": 0.25213675213675213,
"acc_norm_stderr": 0.02844796547623101
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.1979565772669221,
"acc_stderr": 0.01424887354921758,
"acc_norm": 0.1979565772669221,
"acc_norm_stderr": 0.01424887354921758
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27150837988826815,
"acc_stderr": 0.01487425216809527,
"acc_norm": 0.27150837988826815,
"acc_norm_stderr": 0.01487425216809527
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2679738562091503,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.2679738562091503,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24758842443729903,
"acc_stderr": 0.02451387997362197,
"acc_norm": 0.24758842443729903,
"acc_norm_stderr": 0.02451387997362197
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.24691358024691357,
"acc_stderr": 0.023993501709042103,
"acc_norm": 0.24691358024691357,
"acc_norm_stderr": 0.023993501709042103
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24822695035460993,
"acc_stderr": 0.025770015644290382,
"acc_norm": 0.24822695035460993,
"acc_norm_stderr": 0.025770015644290382
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2405475880052151,
"acc_stderr": 0.010916406735478949,
"acc_norm": 0.2405475880052151,
"acc_norm_stderr": 0.010916406735478949
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4375,
"acc_stderr": 0.030134614954403924,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.030134614954403924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25326797385620914,
"acc_stderr": 0.01759348689536683,
"acc_norm": 0.25326797385620914,
"acc_norm_stderr": 0.01759348689536683
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.04265792110940588,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.04265792110940588
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.27346938775510204,
"acc_stderr": 0.02853556033712844,
"acc_norm": 0.27346938775510204,
"acc_norm_stderr": 0.02853556033712844
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2736318407960199,
"acc_stderr": 0.03152439186555402,
"acc_norm": 0.2736318407960199,
"acc_norm_stderr": 0.03152439186555402
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.29518072289156627,
"acc_stderr": 0.03550920185689629,
"acc_norm": 0.29518072289156627,
"acc_norm_stderr": 0.03550920185689629
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.0330140594698725,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.0330140594698725
},
"harness|truthfulqa:mc|0": {
"mc1": 0.21909424724602203,
"mc1_stderr": 0.014480038578757447,
"mc2": 0.39709801636779535,
"mc2_stderr": 0.014808433008259212
},
"harness|winogrande|5": {
"acc": 0.510655090765588,
"acc_stderr": 0.0140492945362904
},
"harness|gsm8k|5": {
"acc": 0.008339651250947688,
"acc_stderr": 0.0025049422268605143
}
}
```
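The `"all"` block at the top of the JSON aggregates the per-task metrics. As a minimal sketch (not necessarily the leaderboard's exact aggregation), the per-task accuracies can be pulled out of a results dict shaped like the one above and averaged; only a few of this run's tasks are included here for brevity:

```python
# Minimal sketch: average per-task accuracy from a results dict shaped
# like the JSON above (subset of tasks, values copied from this run).
results = {
    "harness|arc:challenge|25": {"acc": 0.20648464163822525},
    "harness|hellaswag|10": {"acc": 0.31248755228042224},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.3},
}

accs = [metrics["acc"] for metrics in results.values()]
mean_acc = sum(accs) / len(accs)
print(f"mean acc over {len(accs)} tasks: {mean_acc:.4f}")
```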
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
THUDM/CogVLM-SFT-311K | ---
license: cc-by-nc-4.0
---
# CogVLM-SFT-311K: Bilingual Visual Instruction Data in CogVLM SFT
CogVLM-SFT-311K is the primary aligned corpus used in the initial training of CogVLM v1.0. The process of constructing this dataset is as follows:
1. Approximately 3500 high-quality data samples were selected from the open source [MiniGPT-4](https://huggingface.co/datasets/Vision-CAIR/cc_sbu_align), known as minigpt4-3500.
2. Minigpt4-3500 was integrated with [Llava-Instruct-150K](https://huggingface.co/datasets/liuhaotian/LLaVA-Instruct-150K) and translated into Chinese through a language model.
3. We discovered significant noise in the detailed description part of minigpt4-3500 and Llava-instruct. Thus, we corrected these Chinese corpora and retranslated them into English.
## Dataset Information
The dataset contains three folders, corresponding to the mixed minigpt4-3500/llava detailed descriptions, the llava multi-turn conversations, and the llava single-turn conversations. Their layout is as follows:
```
.CogVLM-SFT-311K
├── llava_details-minigpt4_3500_formate
├── llava_instruction_multi_conversations_formate
└── llava_instruction_single_conversation_formate
```
In our open-source data, the datasets are distributed as follows:
```
.llava_details-minigpt4_3500_formate
├── images
│ └── 00000001.jpg
└── labels
└── 00000001.json
```
Images are stored in the images folder, while labels containing the corresponding image description or dialogue are stored in the labels folder.
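A minimal sketch (standard library only; the path and file stem are illustrative) of pairing an image with its label by their shared file stem, demonstrated here on a throwaway copy of the layout above — in practice you would open the JPG with a library such as Pillow:

```python
import json
import tempfile
from pathlib import Path

def load_pair(root: Path, sample_id: str):
    """Return (image_bytes, label_dict) for a shared file stem."""
    image_bytes = (root / "images" / f"{sample_id}.jpg").read_bytes()
    label = json.loads((root / "labels" / f"{sample_id}.json").read_text(encoding="utf-8"))
    return image_bytes, label

# Demo against a throwaway directory that mimics the layout above
with tempfile.TemporaryDirectory() as tmp:
    root = Path(tmp)
    (root / "images").mkdir()
    (root / "labels").mkdir()
    (root / "images" / "00000001.jpg").write_bytes(b"\xff\xd8\xff")  # JPEG magic bytes
    (root / "labels" / "00000001.json").write_text(
        json.dumps({"captions": [{"role": "caption", "content": "A beach scene."}]}),
        encoding="utf-8",
    )
    image_bytes, label = load_pair(root, "00000001")
    print(label["captions"][0]["content"])  # -> A beach scene.
```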
## Dataset Quantity
+ llava_details-minigpt4_3500_formate: 22,464 images and descriptions
+ llava_instruction_multi_conversations_formate: 56,673 images and multi-turn conversations
+ llava_instruction_single_conversation_formate: 76,634 images and single-turn conversations
## Dataset Format
### Caption format for image description
```
{
"captions": [
{
"role": "caption",
"content": "The photograph features a beach scene with a group of people surfing in the ocean. There are ten individuals spaced out in the water, sitting or lying on various sizes of body boards. The surfers line up in a row, close to the sandy shoreline. Some of the body boards are visible floating on the surface of the water, with the surfers riding on them or nearby."
}
]
}
```
### Conversation format for image dialogue
```
{
"conversations": [
{
"role": "user",
"content": "What can be inferred about the zebras' behavior and surroundings?"
},
{
"role": "assistant",
"content": "Based on the image, we can infer that the two zebras are likely seeking relief from the sun's heat, as they are standing side by side under the branches of a thorny tree. This shade-providing tree offers some respite from the sun, possibly during the hottest part of the day. The zebras are in a green field with grass, providing them with an ideal environment to graze and eat while staying near their source of shelter. This shows that the zebras' behavior is influenced by the conditions and available resources in their surroundings. It also highlights that these animals adopt strategies to adapt to the fluctuating conditions of their environment, such as cooperation and seeking shelter, to survive and thrive in their natural habitat."
}
]
}
```
## License
+ Due to non-commercial agreements, we did not use these data in the bilingual version of CogVLM or any other models involving commercialization.
+ The dataset license adheres to: <br> Attribution-NonCommercial 4.0 International. Use of the data should also abide by the OpenAI terms of use: https://openai.com/policies/terms-of-use
This does not allow you to use these data for any **commercial activities**.
## References
This project utilizes data and concepts based on the following research papers:
- Zhu, D., Chen, J., Shen, X., Li, X., & Elhoseiny, M. (2023). MiniGPT-4: Enhancing Vision-Language Understanding with Advanced Large Language Models. arXiv preprint arXiv:2304.10592.
- Liu, H., Li, C., Wu, Q., & Lee, Y. J. (2023). Visual Instruction Tuning. arXiv:2304.08485.
|
hi-bumblebee/optimus-embeddings-ada-002 | ---
license: apache-2.0
---
|
ai-maker-space/medical_nonmedical | ---
dataset_info:
features:
- name: is_medical
dtype: int64
- name: text
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 25910847
num_examples: 14202
download_size: 16144838
dataset_size: 25910847
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "medical_nonmedical"
This dataset is a combination of [`20newsgroups`](https://scikit-learn.org/0.19/datasets/twenty_newsgroups.html) and the [`Medical Abstracts TC`](https://github.com/sebischair/Medical-Abstracts-TC-Corpus.git) datasets. |
projecte-aina/InToxiCat | ---
YAML tags: null
annotations_creators:
- expert-generated
language:
- ca
license:
- cc-by-nc-4.0
multilinguality:
- monolingual
pretty_name: InToxiCat
tags:
- abusive-language-detection
- abusive-language
- toxic-language-detection
- toxicity-detection
task_categories:
- text-classification
- token-classification
---
# Dataset Card for InToxiCat
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Example](#example)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Website:** https://zenodo.org/records/10600606
- **Point of Contact:** langtech@bsc.es
### Dataset Summary
InToxiCat is a dataset for the detection of abusive language (defined as language aiming to harm someone: an individual, a group, etc.) in Catalan, produced by the BSC LangTech unit.
The dataset consists of 29,809 sentences obtained from internet forums and annotated as to whether or not they are abusive. The 6,047 instances annotated as abusive are further annotated for the following features: abusive span, target span, target type and the implicit or explicit nature of the abusiveness in the message.
The dataset is split, with a balanced abusive/non-abusive distribution, into 23,847 training samples, 2,981 validation samples, and 2,981 test samples.
### Supported Tasks and Leaderboards
Abusive Language Detection
### Languages
The dataset is in Catalan (`ca-ES`).
## Dataset Structure
### Data Instances
Three JSON files, one for each split.
### Example:
<pre>
{
"id": "9472844_16_0",
"context": "Aquest tiu no té ni puta idea del que és una guerra ni del que s'espera d'un soldat.I què s'empatolla de despeses mèdiques. A veure si li passaré com al Hollande i sortiré la factura del seu perruquer (o taxidermista, no sé)",
"sentence": "Aquest tiu no té ni puta idea del que és una guerra ni del que s'espera d'un soldat.I què s'empatolla de despeses mèdiques.",
"topic": "Internacional",
"key_words": [
"puta"
],
"annotation": {
"is_abusive": "abusive",
"abusiveness_agreement": "full",
"context_needed": "no",
"abusive_spans": [
[
"no té ni puta idea",
"11:29"
]
],
"target_spans": [
[
"Aquest tiu",
"0:10"
]
],
"target_type": [
"INDIVIDUAL"
],
"is_implicit": "yes"
}
}
</pre>
### Data Fields
- ``id`` (a string feature): unique identifier of the instance.
- ``context`` (a string feature): complete text message from the user surrounding the sentence (it can coincide totally or only partially with the sentence).
- ``sentence`` (a string feature): text message where the abusiveness is evaluated.
- ``topic`` (a string feature): category from Racó Català forums where the sentence comes from.
- ``key_words`` (a list of strings): keywords used to select the candidate messages to annotate.
- ``context_needed`` (a string feature): "yes" or "no" if all annotators did or did not consult the context to decide on the sentence's abusiveness; "maybe" if they did not agree on this.
- ``is_abusive`` (a bool feature): "abusive" or "not_abusive".
- ``abusiveness_agreement`` (a string feature): "full" if the two annotators agreed on the abusiveness/not-abusiveness of the sentence, and "partial" if the abusiveness had to be decided by a third annotator.
- ``abusive_spans`` (a dictionary with field 'text' (list of strings) and 'index' (list of strings)): the sequence of words that contribute to the text's abusiveness.
- ``is_implicit`` (a string): whether the abusiveness is explicit (contains a profanity, slur or threat) or implicit (does not contain a profanity or slur, but is likely to contain irony, sarcasm or similar resources).
- ``target_spans`` (a dictionary with field 'text' (list of strings) and 'index' (list of strings)): if found in the message, the sequence(s) of words that refer to the target of the text's abusiveness.
- ``target_type`` (a dictionary with field 'text' (list of strings) and 'index' (list of strings)): three possible categories. The categories are non-exclusive, as some targets may have a dual identity and more than one target may be detected in a single message.
- ``individual``: a famous person, a named person or an unnamed person interacting in the conversation.
- ``group``: considered to be a unit based on the same ethnicity, gender or sexual orientation, political affiliation, religious belief or something else.
  - ``other``: e.g. an organization, a situation, an event, or an issue.
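The ``"start:end"`` indices attached to the spans appear to be character offsets into ``sentence`` (in the example above, ``"0:10"`` selects ``"Aquest tiu"``). A minimal Python sketch for recovering a span under that assumption (the helper name is ours, not part of the dataset):

```python
def extract_span(sentence: str, span: list) -> str:
    """Recover the text of a [text, "start:end"] span pair from the sentence."""
    text, index = span
    start, end = (int(i) for i in index.split(":"))
    recovered = sentence[start:end]
    assert recovered == text, f"offset mismatch: {recovered!r} != {text!r}"
    return recovered

sentence = ("Aquest tiu no té ni puta idea del que és una guerra "
            "ni del que s'espera d'un soldat.I què s'empatolla de despeses mèdiques.")
print(extract_span(sentence, ["Aquest tiu", "0:10"]))           # -> Aquest tiu
print(extract_span(sentence, ["no té ni puta idea", "11:29"]))  # -> no té ni puta idea
```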
### Data Splits
* train.json: 23847 examples
* dev.json: 2981 examples
* test.json: 2981 examples
## Dataset Creation
### Curation Rationale
We created this dataset to contribute to the development of language models in Catalan, a low-resource language.
### Source Data
#### Initial Data Collection and Normalization
The sentences to be annotated were collected from [Racó Català](https://www.racocatala.cat/forums) forums using a list of keywords (provided in Zenodo). The messages belong to different categories of Racó Català, specified in the "topic" field of the dataset. The length of the messages varies from one sentence to several sentences.
#### Who are the source language producers?
Anonymized users from Racó Català forums.
### Annotations
#### Annotation process
The annotation process was divided into the following two tasks, carried out in sequential order:
Task 1. The sentences (around 30,000) were annotated by two annotators as either abusive or not abusive. In case of ambiguity in the sentence, the annotators could consult the context, i.e. the whole message of the user (if the sentence to be annotated was a segment contained in the message). In cases where annotators 1 and 2 disagreed about the abusiveness of a message, it was annotated by a third annotator. As a result, the sentences that are ultimately considered abusive are those that were initially annotated as abusive by both annotators or, in the case of an initial disagreement between them, those that were resolved as abusive by the third annotator.
Task 2. The sentences annotated as abusive (6,047) in Task 1 were further annotated by the two main annotators for the following features, explained in the Summary section: abusive spans, implicit/explicit abusiveness, target spans, and target type.
The annotation guidelines are published and available on Zenodo.
#### Who are the annotators?
The annotators were qualified professionals with university education and a demonstrably excellent knowledge of Catalan (minimum level C1 or equivalent).
### Personal and Sensitive Information
No personal or sensitive information included.
## Considerations for Using the Data
### Social Impact of Dataset
We hope this dataset contributes to the development of language models in Catalan, a low-resource language.
### Discussion of Biases
[N/A]
### Other Known Limitations
[N/A]
## Additional Information
### Dataset Curators
Language Technologies Unit at the Barcelona Supercomputing Center (langtech@bsc.es)
This work was funded by the [Departament de la Vicepresidència i de Polítiques Digitals i Territori de la Generalitat de Catalunya](https://politiquesdigitals.gencat.cat/ca/inici/index.html#googtrans(ca|en) within the framework of [Projecte AINA](https://politiquesdigitals.gencat.cat/ca/economia/catalonia-ai/aina).
### Licensing Information
This work is licensed under a [Creative Commons Attribution Non-commercial 4.0 International](https://creativecommons.org/licenses/by-nc/4.0/).
### Citation Information
[](https://doi.org/10.57967/hf/1719)
### Contributions
[N/A] |
efederici/lfqa-preprocessed-it | ---
task_categories:
- question-answering
- text-generation
language:
- it
tags:
- lfqa
- mt
--- |
Felladrin/ChatML-hercules-v2.0 | ---
license: apache-2.0
task_categories:
- question-answering
- text-generation
language:
- en
size_categories:
- 1M<n<10M
---
[Locutusque/hercules-v2.0](https://huggingface.co/datasets/Locutusque/hercules-v2.0) in ChatML format.
Python code used for conversion:
```python
from datasets import load_dataset
import pandas
from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained(
pretrained_model_name_or_path="Felladrin/Llama-160M-Chat-v1"
)
dataset = load_dataset("Locutusque/hercules-v2.0", split="train")
def format(columns):
messages = []
conversation = columns["conversations"]
for i in range(len(conversation)):
message = conversation[i]
content = message["value"]
role = message["from"]
if role == "human":
role = "user"
elif role == "gpt":
role = "assistant"
if role and content:
messages.append(
{
"role": role.strip(),
"content": content.strip(),
}
)
return tokenizer.apply_chat_template(messages, tokenize=False)
pandas.DataFrame({"text": [format(columns) for columns in dataset]}).to_parquet("train.parquet", index=False)
```
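The role mapping inside `format` above ("human" → "user", "gpt" → "assistant") can be factored into a small standalone helper. This sketch reproduces the same mapping and the dropping of empty turns, independent of the tokenizer:

```python
ROLE_MAP = {"human": "user", "gpt": "assistant"}

def to_chat_messages(conversation: list) -> list:
    """Map ShareGPT-style turns to chat-template roles, dropping empty turns."""
    messages = []
    for turn in conversation:
        role = ROLE_MAP.get(turn.get("from"), turn.get("from"))
        content = (turn.get("value") or "").strip()
        if role and content:
            messages.append({"role": role, "content": content})
    return messages

convo = [
    {"from": "human", "value": " Hello! "},
    {"from": "gpt", "value": "Hi, how can I help?"},
    {"from": "gpt", "value": ""},  # empty turns are dropped
]
print(to_chat_messages(convo))
```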
|
EcrOp/RVL-CDIP-documents-donut | ---
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 166476320.0
num_examples: 1600
- name: validation
num_bytes: 15392444.0
num_examples: 160
- name: test
num_bytes: 80367176.0
num_examples: 800
- name: train_800
num_bytes: 83301114.0
num_examples: 800
- name: train_400
num_bytes: 41873661.0
num_examples: 400
- name: train_160
num_bytes: 16726920.0
num_examples: 160
- name: test_2
num_bytes: 15537435.0
num_examples: 160
- name: test_3
num_bytes: 13972614.0
num_examples: 160
- name: test_4
num_bytes: 17178233.0
num_examples: 160
- name: test_5
num_bytes: 17148238.0
num_examples: 160
download_size: 470189693
dataset_size: 467974155.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
- split: train_800
path: data/train_800-*
- split: train_400
path: data/train_400-*
- split: train_160
path: data/train_160-*
- split: test_2
path: data/test_2-*
- split: test_3
path: data/test_3-*
- split: test_4
path: data/test_4-*
- split: test_5
path: data/test_5-*
---
|
gabeorlanski/bc-mbpp | ---
license: apache-2.0
task_categories:
- text-generation
- text2text-generation
language:
- en
tags:
- code
pretty_name: BabelCode MBPP
size_categories:
- 1K<n<10K
source_datasets:
- original
- extended|mbpp
---
# Dataset Card for BabelCode MBPP
## Dataset Description
- **Repository:** [GitHub Repository](https://github.com/google-research/babelcode)
- **Paper:** [Measuring The Impact Of Programming Language Distribution](https://arxiv.org/abs/2302.01973)
### How To Use This Dataset
To use this dataset, you can either use the original [BabelCode Repo](https://github.com/google-research/babelcode), or you can use the [`bc_eval` Metric](https://huggingface.co/spaces/gabeorlanski/bc_eval).
### Dataset Summary
The BabelCode-MBPP (BC-MBPP) dataset converts the [MBPP dataset released by Google](https://arxiv.org/abs/2108.07732) to 16 programming languages.
### Supported Tasks and Leaderboards
### Languages
BC-MBPP supports:
* C++
* C#
* Dart
* Go
* Haskell
* Java
* Javascript
* Julia
* Kotlin
* Lua
* PHP
* Python
* R
* Rust
* Scala
* TypeScript
## Dataset Structure
```python
>>> from datasets import load_dataset
>>> load_dataset("gabeorlanski/bc-mbpp")
DatasetDict({
train: Dataset({
features: ['qid', 'title', 'language', 'text', 'signature_with_docstring', 'signature', 'arguments', 'solution', 'question_info'],
num_rows: 5308
})
test: Dataset({
features: ['qid', 'title', 'language', 'text', 'signature_with_docstring', 'signature', 'arguments', 'solution', 'question_info'],
num_rows: 6989
})
validation: Dataset({
features: ['qid', 'title', 'language', 'text', 'signature_with_docstring', 'signature', 'arguments', 'solution', 'question_info'],
num_rows: 1216
})
prompt: Dataset({
features: ['qid', 'title', 'language', 'text', 'signature_with_docstring', 'signature', 'arguments', 'solution', 'question_info'],
num_rows: 160
})
})
```
### Data Fields
- `qid`: The question ID used for running tests.
- `title`: The title of the question.
- `language`: The programming language of the example.
- `text`: The description of the problem.
- `signature`: The signature for the problem.
- `signature_with_docstring`: The signature with the adequately formatted docstring for the given problem.
- `arguments`: The arguments of the problem.
- `solution`: The solution in Python.
- `question_info`: The dict of information used for executing predictions. It has the keys:
- `test_code`: The raw testing script used in the language. If you want to use this, replace `PLACEHOLDER_FN_NAME` (and `PLACEHOLDER_CLS_NAME` if needed) with the corresponding entry points. Next, replace `PLACEHOLDER_CODE_BODY` with the postprocessed prediction.
- `test_list`: The raw json line of the list of tests for the problem. To load them, use `json.loads`
- `test_case_ids`: The list of test case ids for the problem. These are used to determine if a prediction passes or not.
- `entry_fn_name`: The function's name to use an entry point.
- `entry_cls_name`: The class name to use an entry point.
- `commands`: The commands used to execute the prediction. Includes a `__FILENAME__` hole that is replaced with the filename.
- `timeouts`: The default timeouts for each command.
- `extension`: The extension for the prediction file.
**NOTE:** If you want to use a different function name (or class name for languages that require class names) for the prediction, you must update the `entry_fn_name` and `entry_cls_name` accordingly. For example, if you have the original question with `entry_fn_name` of `add`, but want to change it to `f`, you must update `ds["question_info"]["entry_fn_name"]` to `f`:
```python
>>> from datasets import load_dataset
>>> ds = load_dataset("gabeorlanski/bc-mbpp")['test']
>>> # The original entry_fn_name
>>> ds[0]['question_info']['entry_fn_name']
removeOcc
>>> # You MUST update the corresponding entry_fn_name
>>> ds[0]['question_info']['entry_fn_name'] = 'f'
>>> ds[0]['question_info']['entry_fn_name']
f
```
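As a sketch of how these pieces fit together (this helper is our own illustration, not part of the official BabelCode harness or the `bc_eval` metric), the placeholders in `test_code` might be filled like so:

```python
def build_test_file(question_info: dict, prediction: str) -> str:
    """Fill the test_code placeholders with entry names and a prediction."""
    code = question_info["test_code"]
    code = code.replace("PLACEHOLDER_FN_NAME", question_info["entry_fn_name"])
    if question_info.get("entry_cls_name"):
        code = code.replace("PLACEHOLDER_CLS_NAME", question_info["entry_cls_name"])
    return code.replace("PLACEHOLDER_CODE_BODY", prediction)

# Toy question_info in the shape described above (values are made up)
info = {
    "test_code": "PLACEHOLDER_CODE_BODY\nassert PLACEHOLDER_FN_NAME(1) == 2",
    "entry_fn_name": "add_one",
    "entry_cls_name": "",
}
filled = build_test_file(info, "def add_one(x):\n    return x + 1")
print(filled)
```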
## Dataset Creation
See section 2 of the [BabelCode Paper](https://arxiv.org/abs/2302.01973) to learn more about how the datasets are translated.
Information on how the original MBPP was curated is located [here](https://huggingface.co/datasets/mbpp).
### Dataset Curators
Google Research
### Licensing Information
CC-BY-4.0
### Citation Information
```
@article{orlanski2023measuring,
title={Measuring The Impact Of Programming Language Distribution},
  author={Orlanski, Gabriel and Xiao, Kefan and Garcia, Xavier and Hui, Jeffrey and Howland, Joshua and Malmaud, Jonathan and Austin, Jacob and Singh, Rishabh and Catasta, Michele},
journal={arXiv preprint arXiv:2302.01973},
year={2023}
}
@article{Austin2021ProgramSW,
title={Program Synthesis with Large Language Models},
author={Jacob Austin and Augustus Odena and Maxwell Nye and Maarten Bosma and Henryk Michalewski and David Dohan and Ellen Jiang and Carrie J. Cai and Michael Terry and Quoc V. Le and Charles Sutton},
journal={ArXiv},
year={2021},
volume={abs/2108.07732}
}
``` |
carles-undergrad-thesis/msmarco-query-en-id-parallel-sentences | ---
dataset_info:
features:
- name: text_en
dtype: string
- name: text_id
dtype: string
splits:
- name: train
num_bytes: 39060054
num_examples: 509919
download_size: 27839260
dataset_size: 39060054
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "msmarco-query-en-id-parallel-sentences"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
r0ll/mzlff | ---
license: openrail
language:
- ru
---
Voice model of https://www.twitch.tv/mazellovvv (RVC v2, 300 epochs). |
IGNF/FLAIR_1_osm_clip | ---
size_categories:
- 10K<n<100K
dataset_info:
features:
- name: image
dtype: image
- name: segmentation
dtype: image
- name: metadata
struct:
- name: address
struct:
- name: ISO3166-2-lvl4
dtype: string
- name: ISO3166-2-lvl6
dtype: string
- name: ISO3166-2-lvl7
dtype: string
- name: aerialway
dtype: string
- name: aeroway
dtype: string
- name: allotments
dtype: string
- name: amenity
dtype: string
- name: bridge
dtype: string
- name: building
dtype: string
- name: city
dtype: string
- name: city_district
dtype: string
- name: club
dtype: string
- name: commercial
dtype: string
- name: country
dtype: string
- name: country_code
dtype: string
- name: county
dtype: string
- name: craft
dtype: string
- name: emergency
dtype: string
- name: farm
dtype: string
- name: farmyard
dtype: string
- name: hamlet
dtype: string
- name: healthcare
dtype: string
- name: highway
dtype: string
- name: historic
dtype: string
- name: house_number
dtype: string
- name: industrial
dtype: string
- name: isolated_dwelling
dtype: string
- name: junction
dtype: string
- name: landuse
dtype: string
- name: leisure
dtype: string
- name: locality
dtype: string
- name: man_made
dtype: string
- name: military
dtype: string
- name: municipality
dtype: string
- name: natural
dtype: string
- name: neighbourhood
dtype: string
- name: office
dtype: string
- name: place
dtype: string
- name: postcode
dtype: string
- name: province
dtype: string
- name: quarter
dtype: string
- name: railway
dtype: string
- name: region
dtype: string
- name: residential
dtype: string
- name: retail
dtype: string
- name: road
dtype: string
- name: shop
dtype: string
- name: square
dtype: string
- name: state
dtype: string
- name: state_district
dtype: string
- name: suburb
dtype: string
- name: tourism
dtype: string
- name: town
dtype: string
- name: village
dtype: string
- name: boundingbox
sequence: string
- name: camera
dtype: string
- name: code
dtype: string
- name: date
dtype: string
- name: dimensions
sequence: int64
- name: display_name
dtype: string
- name: domain
dtype: string
- name: lat
dtype: string
- name: latlong
sequence: float64
- name: licence
dtype: string
- name: lon
dtype: string
- name: origin
sequence: float64
- name: osm_id
dtype: int64
- name: osm_type
dtype: string
- name: patch_centroid_x
dtype: float64
- name: patch_centroid_y
dtype: float64
- name: patch_centroid_z
dtype: float64
- name: place_id
dtype: int64
- name: time
dtype: string
- name: unit_system
dtype: string
- name: zone
dtype: string
- name: class_rep
struct:
- name: class_repartition
struct:
- name: agricultural land
dtype: float64
- name: bare soil
dtype: float64
- name: brushwood
dtype: float64
- name: building
dtype: float64
- name: clear cut
dtype: float64
- name: coniferous
dtype: float64
- name: deciduous
dtype: float64
- name: greenhouse
dtype: float64
- name: herbaceous vegetation
dtype: float64
- name: impervious surface
dtype: float64
- name: ligneous
dtype: float64
- name: mixed
dtype: float64
- name: other
dtype: float64
- name: pervious surface
dtype: float64
- name: plowed land
dtype: float64
- name: snow
dtype: float64
- name: swimming pool
dtype: float64
- name: vineyard
dtype: float64
- name: water
dtype: float64
- name: prompt
dtype: string
- name: clip_label
dtype: string
splits:
- name: train
num_bytes: 26842989610.0
num_examples: 61712
download_size: 26786210818
dataset_size: 26842989610.0
tags:
- IGN
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "FLAIR_OSM_CLIP"
Dataset for the Seg2Sat model: https://github.com/RubenGres/Seg2Sat
Derived from [FLAIR#1](https://huggingface.co/datasets/IGNF/FLAIR) train split.
This dataset includes the following features:
- **image**: the RGB bands of the FLAIR#1 .tif files, converted into a more manageable JPG format
- **segmentation**: FLAIR#1 segmentation converted to JPG using the [LUT from the documentation](https://ignf.github.io/FLAIR/index_fr.html)
- **metadata**: OSM metadata for the centroid of the image
- **clip_label**: [CLIP ViT-H](https://github.com/openai/CLIP) description
- **class_rep**: ratio of appearance of each class in the segmentation
- **prompt**: Prompt used for the Seg2Sat model
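As one small usage example, the per-class ratios in **class_rep** make it easy to find a patch's dominant land-cover class. A sketch with made-up values in the shape of the `class_repartition` struct:

```python
def dominant_class(class_repartition: dict) -> tuple:
    """Return the (class name, ratio) pair with the highest coverage in a patch."""
    return max(class_repartition.items(), key=lambda kv: kv[1])

# Made-up ratios in the shape of the class_repartition struct above
ratios = {"building": 0.12, "herbaceous vegetation": 0.55, "water": 0.33}
print(dominant_class(ratios))  # -> ('herbaceous vegetation', 0.55)
```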
|
jamm55/freePidginDataset | ---
task_categories:
- translation
---
# AutoTrain Dataset for project: improved-pidgin-model
## Dataset Description
This dataset has been automatically processed by AutoTrain for project improved-pidgin-model.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"source": "My people, good evening!",
"target": "My people, good evening o!"
},
{
"source": "Uh... my name is Kabiru Sule.",
"target": "Ehm my name be Kabiru Sule."
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"source": "Value(dtype='string', id=None)",
"target": "Value(dtype='string', id=None)"
}
```
### Dataset Splits
This dataset is split into a train and a validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 8591 |
| valid | 648 |
|
Saxo/jp_ko_translation_tech_social_science_linkbricks_single_dataset_with_prompt_text_huggingface | ---
license: apache-2.0
---
|
fgomeza17/Sammy | ---
license: openrail
---
|
dimanchkek/Deepfacelive-DFM-Models | ---
license: gpl-3.0
---
# Description
<!-- Provide a quick summary of the dataset. -->
Here you can find files for [DeepFaceLab](https://github.com/iperov/DeepFaceLab) and [DeepFaceLive](https://github.com/iperov/DeepFaceLive). All sources and active community members are listed below.
## Disclaimer
<!-- Address questions around how the dataset is intended to be used. -->
The author of this repository claims no ownership of the data uploaded here other than what he created himself. Feel free to open a discussion if you would like me to mention your contacts and I haven't done so.
## Risks and Limitations
Use these files at your own risk. The authors of the models and the repository creator cannot be held responsible for your actions.
## Contribution
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
### Discord's servers
- [SHARE YOUR DFM LIVE MODELS](https://discord.gg/HK4rvehcyJ)
- [DFLive sharing](https://discord.gg/uUnzZby6rV)
### Forums
- [mrdeepfakes](https://mrdeepfakes.com/forums/)
- [deepfakevfx](https://www.deepfakevfx.com/)
### Persons worthy of special mention:
- druuziltg (Discord)
- _cxsmo_ (Discord)
- vladpedro (Discord)/@vladpedro (Telegram)
- codyj. (Discord)
- massivebellend (Discord)
- minilady (Discord)
- adrestianempress (Discord)
- rumateus (Discord)
- .grigoletto (Discord)
- ..and your humble servant - aveamd (Discord)
|
Atipico1/mrqa-adv-test-adv | ---
dataset_info:
features:
- name: subset
dtype: string
- name: qid
dtype: string
- name: question
dtype: string
- name: answers
sequence: string
- name: masked_query
dtype: string
- name: context
dtype: string
- name: answer_sent
dtype: string
- name: answer_in_context
sequence: string
- name: query_embedding
sequence: float32
- name: entity
dtype: string
- name: similar_entity
dtype: string
- name: similar_entity_score
dtype: float32
- name: random_entity
dtype: string
- name: random_entity_score
dtype: float64
- name: rewritten_context
dtype: string
- name: valid
dtype: bool
- name: clear_answer_sent
dtype: string
- name: vague_answer_sent
dtype: string
- name: adversary
dtype: string
- name: replace_count
dtype: int64
splits:
- name: train
num_bytes: 5092076
num_examples: 1000
download_size: 5060641
dataset_size: 5092076
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
dog/fuego-20230225-025842-d1b8af | ---
tags:
- fuego
fuego:
id: 20230225-025842-d1b8af
status: done
script: run.py
requirements_file: requirements.txt
space_id: dog/actlearn-fuego-runner
space_hardware: cpu-basic
---
|
CyberHarem/yagami_kou_newgame | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Yagami Kou
This is the dataset of Yagami Kou, containing 460 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 460 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 1090 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 1287 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 460 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 460 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 460 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 1090 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 1090 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 975 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 1287 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 1287 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
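The archives in the table above can also be fetched programmatically. A minimal sketch using `huggingface_hub` (assuming it is installed); `variant_filename` is a hypothetical helper that mirrors the `dataset-<variant>.zip` naming pattern used by the download links:

```python
def variant_filename(variant: str) -> str:
    # Archives in the table above follow the pattern "dataset-<variant>.zip"
    return f"dataset-{variant}.zip"


if __name__ == "__main__":
    # Assumes huggingface_hub is installed (pip install huggingface_hub).
    # Downloads the 384x512 aligned archive from this dataset repo.
    from huggingface_hub import hf_hub_download

    path = hf_hub_download(
        repo_id="CyberHarem/yagami_kou_newgame",
        filename=variant_filename("384x512"),
        repo_type="dataset",
    )
    print(path)
```

The download itself is guarded behind `__main__` so importing the snippet does not trigger network access.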
|
Avinier/docker-llm-conversations | ---
license: mit
---
|
Jessiecs/llama-2-7b-a3-2-lima-backward | ---
dataset_info:
features:
- name: response
dtype: string
- name: instruction_generated
dtype: string
splits:
- name: train
num_bytes: 235129
num_examples: 131
download_size: 151792
dataset_size: 235129
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_facebook__opt-125m | ---
pretty_name: Evaluation run of Facebook/OPT-125M
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Facebook/OPT-125M](https://huggingface.co/Facebook/OPT-125M) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Facebook__OPT-125M\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-23T14:31:42.504661](https://huggingface.co/datasets/open-llm-leaderboard/details_Facebook__OPT-125M/blob/main/results_2024-01-23T14-31-42.504661.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25971933524807705,\n\
\ \"acc_stderr\": 0.030727814194809005,\n \"acc_norm\": 0.26053348115143415,\n\
\ \"acc_norm_stderr\": 0.03151920852026647,\n \"mc1\": 0.23990208078335373,\n\
\ \"mc1_stderr\": 0.014948812679062133,\n \"mc2\": 0.42868550699768687,\n\
\ \"mc2_stderr\": 0.01505826026535896\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.20392491467576793,\n \"acc_stderr\": 0.011774262478702256,\n\
\ \"acc_norm\": 0.22866894197952217,\n \"acc_norm_stderr\": 0.012272853582540792\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2920732921728739,\n\
\ \"acc_stderr\": 0.004537865171414025,\n \"acc_norm\": 0.3143796056562438,\n\
\ \"acc_norm_stderr\": 0.00463319482579384\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n\
\ \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n\
\ \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.03583496176361062,\n\
\ \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.03583496176361062\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.025288394502891363,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.025288394502891363\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768077,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768077\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n\
\ \"acc_stderr\": 0.03063114553919882,\n \"acc_norm\": 0.2023121387283237,\n\
\ \"acc_norm_stderr\": 0.03063114553919882\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082633,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082633\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.18,\n\
\ \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3148936170212766,\n \"acc_stderr\": 0.03036358219723816,\n\
\ \"acc_norm\": 0.3148936170212766,\n \"acc_norm_stderr\": 0.03036358219723816\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.0360010569272777,\n\
\ \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.0360010569272777\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.14285714285714285,\n\
\ \"acc_stderr\": 0.03129843185743809,\n \"acc_norm\": 0.14285714285714285,\n\
\ \"acc_norm_stderr\": 0.03129843185743809\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.3161290322580645,\n \"acc_stderr\": 0.02645087448904277,\n \"\
acc_norm\": 0.3161290322580645,\n \"acc_norm_stderr\": 0.02645087448904277\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n \"\
acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\"\
: 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21212121212121213,\n \"acc_stderr\": 0.03192271569548299,\n\
\ \"acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.03192271569548299\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2727272727272727,\n \"acc_stderr\": 0.03173071239071724,\n \"\
acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.03173071239071724\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.36787564766839376,\n \"acc_stderr\": 0.03480175668466036,\n\
\ \"acc_norm\": 0.36787564766839376,\n \"acc_norm_stderr\": 0.03480175668466036\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.34102564102564104,\n \"acc_stderr\": 0.02403548967633506,\n\
\ \"acc_norm\": 0.34102564102564104,\n \"acc_norm_stderr\": 0.02403548967633506\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.031041941304059288,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.031041941304059288\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23119266055045873,\n \"acc_stderr\": 0.01807575024163315,\n \"\
acc_norm\": 0.23119266055045873,\n \"acc_norm_stderr\": 0.01807575024163315\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25980392156862747,\n\
\ \"acc_stderr\": 0.03077855467869326,\n \"acc_norm\": 0.25980392156862747,\n\
\ \"acc_norm_stderr\": 0.03077855467869326\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.25738396624472576,\n \"acc_stderr\": 0.02845882099146031,\n\
\ \"acc_norm\": 0.25738396624472576,\n \"acc_norm_stderr\": 0.02845882099146031\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.20179372197309417,\n\
\ \"acc_stderr\": 0.026936111912802273,\n \"acc_norm\": 0.20179372197309417,\n\
\ \"acc_norm_stderr\": 0.026936111912802273\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.03727673575596918,\n\
\ \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.03727673575596918\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.38016528925619836,\n \"acc_stderr\": 0.04431324501968432,\n \"\
acc_norm\": 0.38016528925619836,\n \"acc_norm_stderr\": 0.04431324501968432\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.21296296296296297,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.15178571428571427,\n\
\ \"acc_stderr\": 0.034057028381856924,\n \"acc_norm\": 0.15178571428571427,\n\
\ \"acc_norm_stderr\": 0.034057028381856924\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19658119658119658,\n\
\ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.19658119658119658,\n\
\ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952344,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952344\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2515964240102171,\n\
\ \"acc_stderr\": 0.01551732236552963,\n \"acc_norm\": 0.2515964240102171,\n\
\ \"acc_norm_stderr\": 0.01551732236552963\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.23121387283236994,\n \"acc_stderr\": 0.02269865716785571,\n\
\ \"acc_norm\": 0.23121387283236994,\n \"acc_norm_stderr\": 0.02269865716785571\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.27124183006535946,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.27124183006535946,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2379421221864952,\n\
\ \"acc_stderr\": 0.024185150647818707,\n \"acc_norm\": 0.2379421221864952,\n\
\ \"acc_norm_stderr\": 0.024185150647818707\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2932098765432099,\n \"acc_stderr\": 0.025329888171900926,\n\
\ \"acc_norm\": 0.2932098765432099,\n \"acc_norm_stderr\": 0.025329888171900926\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2624113475177305,\n \"acc_stderr\": 0.026244920349843007,\n \
\ \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.026244920349843007\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.25358539765319427,\n\
\ \"acc_stderr\": 0.011111715336101132,\n \"acc_norm\": 0.25358539765319427,\n\
\ \"acc_norm_stderr\": 0.011111715336101132\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n\
\ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.21895424836601307,\n \"acc_stderr\": 0.016729937565537537,\n \
\ \"acc_norm\": 0.21895424836601307,\n \"acc_norm_stderr\": 0.016729937565537537\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n\
\ \"acc_stderr\": 0.04013964554072774,\n \"acc_norm\": 0.22727272727272727,\n\
\ \"acc_norm_stderr\": 0.04013964554072774\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.24897959183673468,\n \"acc_stderr\": 0.027682979522960234,\n\
\ \"acc_norm\": 0.24897959183673468,\n \"acc_norm_stderr\": 0.027682979522960234\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n\
\ \"acc_stderr\": 0.029929415408348398,\n \"acc_norm\": 0.23383084577114427,\n\
\ \"acc_norm_stderr\": 0.029929415408348398\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.20481927710843373,\n\
\ \"acc_stderr\": 0.03141784291663926,\n \"acc_norm\": 0.20481927710843373,\n\
\ \"acc_norm_stderr\": 0.03141784291663926\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.17543859649122806,\n \"acc_stderr\": 0.029170885500727654,\n\
\ \"acc_norm\": 0.17543859649122806,\n \"acc_norm_stderr\": 0.029170885500727654\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23990208078335373,\n\
\ \"mc1_stderr\": 0.014948812679062133,\n \"mc2\": 0.42868550699768687,\n\
\ \"mc2_stderr\": 0.01505826026535896\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.516179952644041,\n \"acc_stderr\": 0.014045126130978601\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.002274450341167551,\n \
\ \"acc_stderr\": 0.0013121578148674316\n }\n}\n```"
repo_url: https://huggingface.co/Facebook/OPT-125M
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|arc:challenge|25_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|arc:challenge|25_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_19T00_45_29.121149
path:
- '**/details_harness|drop|3_2023-10-19T00-45-29.121149.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-19T00-45-29.121149.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_19T00_45_29.121149
path:
- '**/details_harness|gsm8k|5_2023-10-19T00-45-29.121149.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|gsm8k|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hellaswag|10_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hellaswag|10_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:00:10.742260.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T14-31-42.504661.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T14:00:10.742260.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-23T14-31-42.504661.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_19T00_45_29.121149
path:
- '**/details_harness|winogrande|5_2023-10-19T00-45-29.121149.parquet'
- split: 2024_01_23T14_31_42.504661
path:
- '**/details_harness|winogrande|5_2024-01-23T14-31-42.504661.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-23T14-31-42.504661.parquet'
- config_name: results
data_files:
- split: 2023_07_19T14_00_10.742260
path:
- results_2023-07-19T14:00:10.742260.parquet
- split: 2023_10_19T00_45_29.121149
path:
- results_2023-10-19T00-45-29.121149.parquet
- split: 2024_01_23T14_31_42.504661
path:
- results_2024-01-23T14-31-42.504661.parquet
- split: latest
path:
- results_2024-01-23T14-31-42.504661.parquet
---
# Dataset Card for Evaluation run of Facebook/OPT-125M
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Facebook/OPT-125M](https://huggingface.co/Facebook/OPT-125M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Facebook__OPT-125M",
"harness_winogrande_5",
	split="latest")
```
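The per-task configuration names follow a regular pattern (visible in this card's YAML metadata, e.g. `harness_hendrycksTest_anatomy_5`, where the trailing number is the few-shot count). A small helper — hypothetical, not part of the `datasets` API — can build these names programmatically:

```python
def task_config(subject: str, num_fewshot: int = 5) -> str:
    """Build the config name for a per-task MMLU details table.

    Assumption: names follow the "harness_hendrycksTest_<subject>_<k>"
    pattern used in this card's YAML metadata.
    """
    return f"harness_hendrycksTest_{subject}_{num_fewshot}"


print(task_config("anatomy"))  # harness_hendrycksTest_anatomy_5
```

The resulting string can then be passed as the second argument to `load_dataset`, as in the example above.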
## Latest results
These are the [latest results from run 2024-01-23T14:31:42.504661](https://huggingface.co/datasets/open-llm-leaderboard/details_Facebook__OPT-125M/blob/main/results_2024-01-23T14-31-42.504661.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.25971933524807705,
"acc_stderr": 0.030727814194809005,
"acc_norm": 0.26053348115143415,
"acc_norm_stderr": 0.03151920852026647,
"mc1": 0.23990208078335373,
"mc1_stderr": 0.014948812679062133,
"mc2": 0.42868550699768687,
"mc2_stderr": 0.01505826026535896
},
"harness|arc:challenge|25": {
"acc": 0.20392491467576793,
"acc_stderr": 0.011774262478702256,
"acc_norm": 0.22866894197952217,
"acc_norm_stderr": 0.012272853582540792
},
"harness|hellaswag|10": {
"acc": 0.2920732921728739,
"acc_stderr": 0.004537865171414025,
"acc_norm": 0.3143796056562438,
"acc_norm_stderr": 0.00463319482579384
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.03673731683969506,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.03673731683969506
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.03583496176361062,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.03583496176361062
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.025288394502891363,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.025288394502891363
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2023121387283237,
"acc_stderr": 0.03063114553919882,
"acc_norm": 0.2023121387283237,
"acc_norm_stderr": 0.03063114553919882
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082633,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082633
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3148936170212766,
"acc_stderr": 0.03036358219723816,
"acc_norm": 0.3148936170212766,
"acc_norm_stderr": 0.03036358219723816
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.0360010569272777,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.0360010569272777
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.14285714285714285,
"acc_stderr": 0.03129843185743809,
"acc_norm": 0.14285714285714285,
"acc_norm_stderr": 0.03129843185743809
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.03192271569548299,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.03192271569548299
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.03173071239071724,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.03173071239071724
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.36787564766839376,
"acc_stderr": 0.03480175668466036,
"acc_norm": 0.36787564766839376,
"acc_norm_stderr": 0.03480175668466036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.34102564102564104,
"acc_stderr": 0.02403548967633506,
"acc_norm": 0.34102564102564104,
"acc_norm_stderr": 0.02403548967633506
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.031041941304059288,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.031041941304059288
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23119266055045873,
"acc_stderr": 0.01807575024163315,
"acc_norm": 0.23119266055045873,
"acc_norm_stderr": 0.01807575024163315
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.03077855467869326,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.03077855467869326
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25738396624472576,
"acc_stderr": 0.02845882099146031,
"acc_norm": 0.25738396624472576,
"acc_norm_stderr": 0.02845882099146031
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.20179372197309417,
"acc_stderr": 0.026936111912802273,
"acc_norm": 0.20179372197309417,
"acc_norm_stderr": 0.026936111912802273
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.03727673575596918,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.03727673575596918
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.38016528925619836,
"acc_stderr": 0.04431324501968432,
"acc_norm": 0.38016528925619836,
"acc_norm_stderr": 0.04431324501968432
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.15178571428571427,
"acc_stderr": 0.034057028381856924,
"acc_norm": 0.15178571428571427,
"acc_norm_stderr": 0.034057028381856924
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19658119658119658,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.19658119658119658,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952344,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952344
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2515964240102171,
"acc_stderr": 0.01551732236552963,
"acc_norm": 0.2515964240102171,
"acc_norm_stderr": 0.01551732236552963
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23121387283236994,
"acc_stderr": 0.02269865716785571,
"acc_norm": 0.23121387283236994,
"acc_norm_stderr": 0.02269865716785571
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.27124183006535946,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.27124183006535946,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2379421221864952,
"acc_stderr": 0.024185150647818707,
"acc_norm": 0.2379421221864952,
"acc_norm_stderr": 0.024185150647818707
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2932098765432099,
"acc_stderr": 0.025329888171900926,
"acc_norm": 0.2932098765432099,
"acc_norm_stderr": 0.025329888171900926
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2624113475177305,
"acc_stderr": 0.026244920349843007,
"acc_norm": 0.2624113475177305,
"acc_norm_stderr": 0.026244920349843007
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.25358539765319427,
"acc_stderr": 0.011111715336101132,
"acc_norm": 0.25358539765319427,
"acc_norm_stderr": 0.011111715336101132
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.21895424836601307,
"acc_stderr": 0.016729937565537537,
"acc_norm": 0.21895424836601307,
"acc_norm_stderr": 0.016729937565537537
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072774,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072774
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24897959183673468,
"acc_stderr": 0.027682979522960234,
"acc_norm": 0.24897959183673468,
"acc_norm_stderr": 0.027682979522960234
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.029929415408348398,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.029929415408348398
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-virology|5": {
"acc": 0.20481927710843373,
"acc_stderr": 0.03141784291663926,
"acc_norm": 0.20481927710843373,
"acc_norm_stderr": 0.03141784291663926
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.17543859649122806,
"acc_stderr": 0.029170885500727654,
"acc_norm": 0.17543859649122806,
"acc_norm_stderr": 0.029170885500727654
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23990208078335373,
"mc1_stderr": 0.014948812679062133,
"mc2": 0.42868550699768687,
"mc2_stderr": 0.01505826026535896
},
"harness|winogrande|5": {
"acc": 0.516179952644041,
"acc_stderr": 0.014045126130978601
},
"harness|gsm8k|5": {
"acc": 0.002274450341167551,
"acc_stderr": 0.0013121578148674316
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
DataFrek/Dairy_Data | ---
license: mit
---
|
projecte-aina/MentorCA | ---
license: cc-by-4.0
task_categories:
- question-answering
- text-generation
- summarization
language:
- ca
pretty_name: Mentor_CA
size_categories:
- 1K<n<10K
---
## Dataset Description
- **Homepage:** [Projecte AINA](https://projecteaina.cat/tech/)
- **Repository:** [HuggingFace](https://huggingface.co/projecte-aina)
- **Paper:** N/A
- **Leaderboard:** N/A
- **Point of Contact:** langtech@bsc.es
### Dataset Summary
Mentor_CA is an open-source dataset of 10,175 instructions in Catalan, machine-translated from the original [Mentor_ES](https://huggingface.co/datasets/projecte-aina/MENTOR_ES) dataset in Spanish, and organized into several of the behavioral categories outlined in the [InstructGPT](https://arxiv.org/abs/2203.02155) paper, including closed QA, open QA, general QA, classification, information extraction, summarization, creative writing and brainstorming.
### Supported Tasks and Leaderboards
Useful for instruction fine-tuning of large language models for downstream tasks.
### Languages
This dataset is in Catalan (ca-ES).
## Dataset Structure
### Data Instances
The dataset is provided in JSON format, with the same fields as in the [Dolly databricks dataset](https://huggingface.co/datasets/databricks/databricks-dolly-15k), where each record corresponds to a single instruction-following instance and contains the category, the instruction, a context, if available, and the response.
| category | instruction | context | response |
|-----------|-------------------------------------------------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------------|---------------------------------------------|
| open_qa | Qui va inventar el nus de corbata més usat del món? | | L'inventor del nus de corbata més usat del món el va inventar Eduard VIII, duc de Windsor. |
### Data Fields
- `category`: text string containing the type of instruction.
- `instruction`: text string containing the prompt.
- `context`: text string containing the information that the response is based on. It is only available for closed QA, information extraction and summarization.
- `response`: text string containing the response to the instruction.
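As an illustrative sketch (the field names come from the card above, but the record content below is only an example taken from the Data Instances table), a single record can be parsed with Python's standard `json` module:

```python
import json

# Example record following the schema described above:
# category, instruction, context (may be empty), response.
raw = """
{
  "category": "open_qa",
  "instruction": "Qui va inventar el nus de corbata més usat del món?",
  "context": "",
  "response": "L'inventor del nus de corbata més usat del món el va inventar Eduard VIII, duc de Windsor."
}
"""

record = json.loads(raw)

# Context is only populated for closed QA, information extraction
# and summarization; open QA leaves it empty.
assert record["category"] == "open_qa"
assert record["context"] == ""
```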
### Data Splits
We do not provide canonical splits for Mentor_CA other than the categories used for generating the dataset.
| Category | Number of instructions |
|----------------|------------------|
| Open_QA | 2500 |
| General_QA | 1500 |
| Classification | 1450 |
| Closed_QA | 1250 |
| Brainstorming | 1200 |
| Information_extraction | 1000 |
| Summarization | 800 |
| Creative_writing | 475 |
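The per-category counts in the table above can be sanity-checked against the 10,175 total stated in the summary with a few lines of Python:

```python
# Category sizes as listed in the table above.
counts = {
    "open_qa": 2500,
    "general_qa": 1500,
    "classification": 1450,
    "closed_qa": 1250,
    "brainstorming": 1200,
    "information_extraction": 1000,
    "summarization": 800,
    "creative_writing": 475,
}

total = sum(counts.values())
assert total == 10175  # matches the dataset size in the summary
```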
## Dataset Creation
### Curation Rationale
Mentor_CA is an open-source dataset of 10,175 records to enable large language models to exhibit conversational interactivity. Annotators were asked to create prompt-response pairs in each of eight different instruction categories, including the seven described in the InstructGPT paper, as well as an open-ended free-form category (General QA). Annotators were allowed to use information from any source on the web to gather text fragments for the `context` field in closed QA, information extraction and summarization, and were explicitly instructed to rephrase any response that came directly from the web. They were also asked to distribute the questions evenly across the topics, which are included in the [topic list file](https://huggingface.co/datasets/projecte-aina/MENTOR_ES/blob/main/topic_list.csv). Examples of each behavior were provided to motivate the types of questions and instructions appropriate for each category.
### Source Data
- **Human-generated data**: The annotators were asked to create prompt-response pairs in each of eight different instruction categories.
- **Web**: For instruction categories that require a reference text (closed QA, information extraction and summarization) contributors selected passages from any website. No guidance was given to annotators as to how to select the target passages. If any response was taken from the web, it had to be rephrased.
#### Initial Data Collection and Normalization
To create the dataset, annotators were given a brief description of the annotation task, as well as format specifications for prompts and responses separately. Examples were also provided for each task.
The guidelines were concise by design to encourage a high rate of task completion and freedom of writing. However, care was taken to ensure that the categories were clear and that the boundaries between them did not overlap. For example, closed QA was formulated to include questions that focus on the 5W interrogative pronouns: Who (quién), What (qué), When (cuándo), Where (dónde), Why (por qué); information extraction could be confused with summarization or closed QA, so the prompt had to include a clear instruction to extract some kind of information from the given reference text.
#### Who are the source language producers?
The data was generated entirely by native Spanish annotators. Text obtained from the web for the `context` field was kept as is, while the `response` field was rewritten.
### Annotations
The annotation guidelines for each of the categories are as follows:
- **Closed QA** (closed_qa): Questions that can only be answered from a reference text. The annotators must provide a text from any web page and ask a question whose answer is found in the text.
- **Open QA** (open_qa): Questions of common culture that can be answered without consulting any source or with a simple search on the Internet.
- **General QA** (general_qa): Questions that are very general and do not necessarily have to be objective. In fact, it is desirable that they be as subjective as possible.
- **Classification** (classification): Questions that serve to obtain classifications or categorizations of a list of items in different categories to which they may belong.
- **Information Extraction** (inf_ext): Questions used to extract a list of data or information from a reference text.
- **Summarization** (summarization): Questions to ask for a summary or synthesis of a text provided by the annotator.
- **Creative Writing** (creative_wr): Prompts phrased as instructions to obtain an original text (a story, a letter, a song, an article, a poem, a narrative, etc.).
- **Brainstorming** (brainstorming): Questions to obtain a list of ideas or possible options to an issue.
#### Annotation process
The annotators were divided into two groups, with one group collecting reference text and asking a question, and the other group providing a response to the instruction.
#### Who are the annotators?
While labels and text were produced by humans, no further information about the people or systems involved was provided when creating this resource.
### Personal and Sensitive Information
This dataset contains public information (e.g., some information from the web). To our knowledge, there are no private person’s personal identifiers or sensitive information.
## Considerations for Using the Data
### Social Impact of Dataset
[N/A]
### Discussion of Biases
[N/A]
### Other Known Limitations
- The contents of this dataset may reflect the bias, factual errors and topical focus found on the web.
- Annotator demographics and subject matter may reflect the makeup of the annotators.
## Additional Information
### Dataset Curators
Language Technologies Unit (langtech@bsc.es) at the Barcelona Supercomputing Center (BSC).
This work was funded by the [Departament de la Vicepresidència i de Polítiques Digitals i Territori de la Generalitat de Catalunya](https://politiquesdigitals.gencat.cat/ca/inici/) within the framework of [Projecte AINA](https://projecteaina.cat/tech/).
### Licensing Information
This dataset can be used for any purpose, whether academic or commercial, under the terms of the [CC BY 4.0](https://creativecommons.org/licenses/by/4.0/). Give appropriate credit, provide a link to the license, and indicate if changes were made.
### Citation Information
[N/A]
### Contributions
[N/A] |
CyberHarem/mercedes_von_martlitz_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of mercedes_von_martlitz (Fire Emblem)
This is the dataset of mercedes_von_martlitz (Fire Emblem), containing 57 images and their tags.
The core tags of this character are `long_hair, blonde_hair, breasts, large_breasts, blue_eyes, bow, hair_bow, hat, straw_hat`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 57 | 84.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mercedes_von_martlitz_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 57 | 44.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mercedes_von_martlitz_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 138 | 91.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mercedes_von_martlitz_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 57 | 72.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mercedes_von_martlitz_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 138 | 135.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mercedes_von_martlitz_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/mercedes_von_martlitz_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, blush, cleavage, looking_at_viewer, navel, sarong, smile, solo, black_bikini, collarbone, sun_hat, thighs, frilled_bikini, simple_background, bare_shoulders, brown_bikini, hand_on_headwear, official_alternate_costume, parted_lips, purple_eyes, sitting, white_background |
| 1 | 5 |  |  |  |  |  | 1girl, bare_shoulders, bikini, bracelet, cleavage, cup, food, full_body, navel, sandals, simple_background, solo, blush, frills, nail_polish, collarbone, holding, sarong, white_background, grey_background, legs, one_eye_closed, open_mouth, parted_lips, see-through, smile, spoon, stomach, thighs, toes, torn_clothes |
| 2 | 5 |  |  |  |  |  | 1girl, collarbone, hair_over_shoulder, looking_at_viewer, nipples, solo, blush, completely_nude, parted_bangs, smile, thighs, navel, side_ponytail, simple_background, closed_mouth, sitting |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | cleavage | looking_at_viewer | navel | sarong | smile | solo | black_bikini | collarbone | sun_hat | thighs | frilled_bikini | simple_background | bare_shoulders | brown_bikini | hand_on_headwear | official_alternate_costume | parted_lips | purple_eyes | sitting | white_background | bikini | bracelet | cup | food | full_body | sandals | frills | nail_polish | holding | grey_background | legs | one_eye_closed | open_mouth | see-through | spoon | stomach | toes | torn_clothes | hair_over_shoulder | nipples | completely_nude | parted_bangs | side_ponytail | closed_mouth |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-----------|:--------------------|:--------|:---------|:--------|:-------|:---------------|:-------------|:----------|:---------|:-----------------|:--------------------|:-----------------|:---------------|:-------------------|:-----------------------------|:--------------|:--------------|:----------|:-------------------|:---------|:-----------|:------|:-------|:------------|:----------|:---------|:--------------|:----------|:------------------|:-------|:-----------------|:-------------|:--------------|:--------|:----------|:-------|:---------------|:---------------------|:----------|:------------------|:---------------|:----------------|:---------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | | X | X | X | X | | X | | X | | X | X | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | | X | X | | X | X | | X | | X | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X |
|
owanr/o1o2o3_large_r2_iterater | ---
dataset_info:
features:
- name: src
dtype: string
- name: tgt
sequence: string
splits:
- name: train
num_bytes: 5030346
num_examples: 7210
download_size: 2021918
dataset_size: 5030346
---
# Dataset Card for "o1o2o3_large_r2_iterater"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
amitness/wikipedia_it | ---
language: it
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 4809624134
num_examples: 1808474
download_size: 2865384809
dataset_size: 4809624134
---
# Dataset Card for "wikipedia_it"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LambdaTests/VQAv2_sample_validation_benchmarks_partition_global_13_loca_5 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: response
dtype: string
splits:
- name: train
num_bytes: 41
num_examples: 1
download_size: 0
dataset_size: 41
---
# Dataset Card for "VQAv2_sample_validation_benchmarks_partition_global_13_loca_5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
avsolatorio/mteb-amazon_massive_scenario-avs_triplets | ---
dataset_info:
features:
- name: id
dtype: string
- name: label
dtype:
class_label:
names:
'0': social
'1': transport
'2': calendar
'3': play
'4': news
'5': datetime
'6': recommendation
'7': email
'8': iot
'9': general
'10': audio
'11': lists
'12': qa
'13': cooking
'14': takeaway
'15': music
'16': alarm
'17': weather
- name: label_text
dtype: string
- name: text
dtype: string
- name: idx
dtype: int64
- name: query_idx
dtype: int64
- name: positive_idx
dtype: int64
- name: negative_idx
dtype: int64
splits:
- name: train
num_bytes: 1119338
num_examples: 11514
download_size: 644764
dataset_size: 1119338
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# MTEB Amazon Massive Scenario Triplets Dataset
This dataset was used in the paper GISTEmbed: Guided In-sample Selection of Training Negatives for Text Embedding Fine-tuning. Refer to https://arxiv.org/abs/2402.16829 for details.
The code for generating the data is available at https://github.com/avsolatorio/GISTEmbed/blob/main/scripts/create_classification_dataset.py.
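As a rough sketch of how the index fields fit together (the rows below are invented; only the field names come from the schema above), each row's `query_idx`, `positive_idx` and `negative_idx` refer to other rows by their `idx`, so a (query, positive, negative) text triplet can be resolved like this:

```python
# Hypothetical mini-dataset with the same fields as the schema above.
rows = [
    {"idx": 0, "text": "wake me up at 7 am", "label_text": "alarm",
     "query_idx": 0, "positive_idx": 1, "negative_idx": 2},
    {"idx": 1, "text": "set an alarm for tomorrow morning", "label_text": "alarm",
     "query_idx": 1, "positive_idx": 0, "negative_idx": 2},
    {"idx": 2, "text": "what is the weather like", "label_text": "weather",
     "query_idx": 2, "positive_idx": 2, "negative_idx": 0},
]

by_idx = {row["idx"]: row for row in rows}

def resolve_triplet(row):
    """Return the (query, positive, negative) texts for one row."""
    return (
        by_idx[row["query_idx"]]["text"],
        by_idx[row["positive_idx"]]["text"],
        by_idx[row["negative_idx"]]["text"],
    )

query, positive, negative = resolve_triplet(rows[0])
# The positive shares the query's scenario label; the negative does not.
assert by_idx[rows[0]["positive_idx"]]["label_text"] == rows[0]["label_text"]
assert by_idx[rows[0]["negative_idx"]]["label_text"] != rows[0]["label_text"]
```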
## Citation
```
@article{solatorio2024gistembed,
title={GISTEmbed: Guided In-sample Selection of Training Negatives for Text Embedding Fine-tuning},
author={Aivin V. Solatorio},
journal={arXiv preprint arXiv:2402.16829},
year={2024},
URL={https://arxiv.org/abs/2402.16829},
eprint={2402.16829},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
``` |
one-sec-cv12/chunk_64 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 22254129504.75
num_examples: 231698
download_size: 19758813666
dataset_size: 22254129504.75
---
# Dataset Card for "chunk_64"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
argilla/databricks-dolly-15k-es-deepl | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: response
dtype: string
- name: category
dtype: string
- name: instruction_en
dtype: string
- name: context_en
dtype: string
- name: response_en
dtype: string
splits:
- name: train
num_bytes: 25838910
num_examples: 15015
download_size: 16464221
dataset_size: 25838910
---
# Dataset Card for "databricks-dolly-15k-es-deepl"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/guichen_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of guichen/ギシャン/吉尚 (Azur Lane)
This is the dataset of guichen/ギシャン/吉尚 (Azur Lane), containing 22 images and their tags.
The core tags of this character are `long_hair, breasts, hat, large_breasts, white_headwear, witch_hat, earrings, bangs, purple_eyes, very_long_hair, blue_eyes, mole, white_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 22 | 38.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/guichen_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 22 | 18.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/guichen_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 52 | 39.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/guichen_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 22 | 32.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/guichen_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 52 | 62.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/guichen_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/guichen_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
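The non-raw IMG+TXT packages in the table above are plain zip archives, so the same download-and-extract step applies to them. A minimal sketch of that step as a reusable helper (the name `extract_package` is our own, not part of waifuc):

```python
import os
import zipfile


def extract_package(zip_path: str, dest_dir: str) -> list:
    """Extract a downloaded dataset package zip and return its member names."""
    os.makedirs(dest_dir, exist_ok=True)
    with zipfile.ZipFile(zip_path, "r") as zf:
        zf.extractall(dest_dir)
        return zf.namelist()
```

Pass it the path returned by `hf_hub_download` (e.g. for `dataset-800.zip`) and a target directory; the returned names let you iterate over the extracted image/text pairs.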
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 22 |  |  |  |  |  | 1girl, bare_shoulders, solo, jewelry, looking_at_viewer, detached_sleeves, smile, white_dress, white_thighhighs, black_panties, crescent, thighs, navel, see-through, blush, witch |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | solo | jewelry | looking_at_viewer | detached_sleeves | smile | white_dress | white_thighhighs | black_panties | crescent | thighs | navel | see-through | blush | witch |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-------|:----------|:--------------------|:-------------------|:--------|:--------------|:-------------------|:----------------|:-----------|:---------|:--------|:--------------|:--------|:--------|
| 0 | 22 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
atmallen/inventions_azaria_mitchell | ---
dataset_info:
features:
- name: statement
dtype: string
- name: label
dtype:
class_label:
names:
'0': 'false'
'1': 'true'
splits:
- name: train
num_bytes: 36994.520547945205
num_examples: 700
- name: test
num_bytes: 9301.479452054795
num_examples: 176
download_size: 21827
dataset_size: 46296.0
---
# Dataset Card for "inventions_azaria_mitchell"
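The `label` column is stored as integers under the `class_label` mapping declared in the YAML header. A minimal sketch of that mapping (the helper names are our own; `datasets.ClassLabel` provides the same `int2str`/`str2int` conversions once the dataset is loaded):

```python
# Mirrors the class_label declaration above: '0' -> 'false', '1' -> 'true'
LABEL_NAMES = ["false", "true"]


def int2str(label_id: int) -> str:
    """Map an integer label from the dataset to its string name."""
    return LABEL_NAMES[label_id]


def str2int(name: str) -> int:
    """Map a string name back to the integer stored in the dataset."""
    return LABEL_NAMES.index(name)
```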
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_LordNoah__spin_gpt2_medium_alpaca_e3 | ---
pretty_name: Evaluation run of LordNoah/spin_gpt2_medium_alpaca_e3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [LordNoah/spin_gpt2_medium_alpaca_e3](https://huggingface.co/LordNoah/spin_gpt2_medium_alpaca_e3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LordNoah__spin_gpt2_medium_alpaca_e3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-19T08:22:40.774701](https://huggingface.co/datasets/open-llm-leaderboard/details_LordNoah__spin_gpt2_medium_alpaca_e3/blob/main/results_2024-02-19T08-22-40.774701.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2687245385759416,\n\
\ \"acc_stderr\": 0.031261376728097146,\n \"acc_norm\": 0.27140216341935136,\n\
\ \"acc_norm_stderr\": 0.0321026383447453,\n \"mc1\": 0.20563035495716034,\n\
\ \"mc1_stderr\": 0.014148482219460974,\n \"mc2\": 0.42196083615751423,\n\
\ \"mc2_stderr\": 0.01499759385979445\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2235494880546075,\n \"acc_stderr\": 0.01217489663120261,\n\
\ \"acc_norm\": 0.2781569965870307,\n \"acc_norm_stderr\": 0.0130944699195388\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.31159131647082255,\n\
\ \"acc_stderr\": 0.004621972524152965,\n \"acc_norm\": 0.3881696873132842,\n\
\ \"acc_norm_stderr\": 0.004863375698153868\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.32894736842105265,\n \"acc_stderr\": 0.03823428969926603,\n\
\ \"acc_norm\": 0.32894736842105265,\n \"acc_norm_stderr\": 0.03823428969926603\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.24,\n\
\ \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.24,\n \
\ \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2792452830188679,\n \"acc_stderr\": 0.02761116340239972,\n\
\ \"acc_norm\": 0.2792452830188679,\n \"acc_norm_stderr\": 0.02761116340239972\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.34,\n\
\ \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2138728323699422,\n\
\ \"acc_stderr\": 0.031265112061730424,\n \"acc_norm\": 0.2138728323699422,\n\
\ \"acc_norm_stderr\": 0.031265112061730424\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.20425531914893616,\n \"acc_stderr\": 0.026355158413349424,\n\
\ \"acc_norm\": 0.20425531914893616,\n \"acc_norm_stderr\": 0.026355158413349424\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489362,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489362\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2896551724137931,\n \"acc_stderr\": 0.03780019230438015,\n\
\ \"acc_norm\": 0.2896551724137931,\n \"acc_norm_stderr\": 0.03780019230438015\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24867724867724866,\n \"acc_stderr\": 0.022261817692400168,\n \"\
acc_norm\": 0.24867724867724866,\n \"acc_norm_stderr\": 0.022261817692400168\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.03670066451047181,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.03670066451047181\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25806451612903225,\n\
\ \"acc_stderr\": 0.02489246917246284,\n \"acc_norm\": 0.25806451612903225,\n\
\ \"acc_norm_stderr\": 0.02489246917246284\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.32019704433497537,\n \"acc_stderr\": 0.032826493853041504,\n\
\ \"acc_norm\": 0.32019704433497537,\n \"acc_norm_stderr\": 0.032826493853041504\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\
: 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2787878787878788,\n \"acc_stderr\": 0.03501438706296781,\n\
\ \"acc_norm\": 0.2787878787878788,\n \"acc_norm_stderr\": 0.03501438706296781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.35858585858585856,\n \"acc_stderr\": 0.03416903640391521,\n \"\
acc_norm\": 0.35858585858585856,\n \"acc_norm_stderr\": 0.03416903640391521\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.35233160621761656,\n \"acc_stderr\": 0.03447478286414359,\n\
\ \"acc_norm\": 0.35233160621761656,\n \"acc_norm_stderr\": 0.03447478286414359\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3384615384615385,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.3384615384615385,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.23703703703703705,\n \"acc_stderr\": 0.02592887613276611,\n \
\ \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.02592887613276611\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.226890756302521,\n \"acc_stderr\": 0.027205371538279483,\n \
\ \"acc_norm\": 0.226890756302521,\n \"acc_norm_stderr\": 0.027205371538279483\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\"\
: 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3192660550458716,\n\
\ \"acc_stderr\": 0.019987829069750013,\n \"acc_norm\": 0.3192660550458716,\n\
\ \"acc_norm_stderr\": 0.019987829069750013\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.39814814814814814,\n \"acc_stderr\": 0.03338473403207401,\n\
\ \"acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.03338473403207401\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24509803921568626,\n \"acc_stderr\": 0.030190282453501943,\n \"\
acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.030190282453501943\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.13452914798206278,\n\
\ \"acc_stderr\": 0.022901183761575586,\n \"acc_norm\": 0.13452914798206278,\n\
\ \"acc_norm_stderr\": 0.022901183761575586\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3053435114503817,\n \"acc_stderr\": 0.04039314978724561,\n\
\ \"acc_norm\": 0.3053435114503817,\n \"acc_norm_stderr\": 0.04039314978724561\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.17355371900826447,\n \"acc_stderr\": 0.0345727283691767,\n \"\
acc_norm\": 0.17355371900826447,\n \"acc_norm_stderr\": 0.0345727283691767\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.24074074074074073,\n\
\ \"acc_stderr\": 0.041331194402438376,\n \"acc_norm\": 0.24074074074074073,\n\
\ \"acc_norm_stderr\": 0.041331194402438376\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.16964285714285715,\n\
\ \"acc_stderr\": 0.0356236785009539,\n \"acc_norm\": 0.16964285714285715,\n\
\ \"acc_norm_stderr\": 0.0356236785009539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2912621359223301,\n \"acc_stderr\": 0.044986763205729224,\n\
\ \"acc_norm\": 0.2912621359223301,\n \"acc_norm_stderr\": 0.044986763205729224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.23076923076923078,\n\
\ \"acc_stderr\": 0.027601921381417614,\n \"acc_norm\": 0.23076923076923078,\n\
\ \"acc_norm_stderr\": 0.027601921381417614\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2861271676300578,\n \"acc_stderr\": 0.024332146779134128,\n\
\ \"acc_norm\": 0.2861271676300578,\n \"acc_norm_stderr\": 0.024332146779134128\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25027932960893856,\n\
\ \"acc_stderr\": 0.014487500852850412,\n \"acc_norm\": 0.25027932960893856,\n\
\ \"acc_norm_stderr\": 0.014487500852850412\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.025829163272757482,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.025829163272757482\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2990353697749196,\n\
\ \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.2990353697749196,\n\
\ \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.23148148148148148,\n \"acc_stderr\": 0.02346842983245115,\n\
\ \"acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.02346842983245115\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2375886524822695,\n \"acc_stderr\": 0.025389512552729906,\n \
\ \"acc_norm\": 0.2375886524822695,\n \"acc_norm_stderr\": 0.025389512552729906\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26792698826597133,\n\
\ \"acc_stderr\": 0.011311347690633885,\n \"acc_norm\": 0.26792698826597133,\n\
\ \"acc_norm_stderr\": 0.011311347690633885\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3602941176470588,\n \"acc_stderr\": 0.029163128570670736,\n\
\ \"acc_norm\": 0.3602941176470588,\n \"acc_norm_stderr\": 0.029163128570670736\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2679738562091503,\n \"acc_stderr\": 0.017917974069594726,\n \
\ \"acc_norm\": 0.2679738562091503,\n \"acc_norm_stderr\": 0.017917974069594726\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2,\n\
\ \"acc_stderr\": 0.03831305140884603,\n \"acc_norm\": 0.2,\n \
\ \"acc_norm_stderr\": 0.03831305140884603\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3469387755102041,\n \"acc_stderr\": 0.030472526026726492,\n\
\ \"acc_norm\": 0.3469387755102041,\n \"acc_norm_stderr\": 0.030472526026726492\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.03036049015401467,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.03036049015401467\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.22289156626506024,\n\
\ \"acc_stderr\": 0.03240004825594689,\n \"acc_norm\": 0.22289156626506024,\n\
\ \"acc_norm_stderr\": 0.03240004825594689\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.03377310252209195,\n\
\ \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.03377310252209195\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.20563035495716034,\n\
\ \"mc1_stderr\": 0.014148482219460974,\n \"mc2\": 0.42196083615751423,\n\
\ \"mc2_stderr\": 0.01499759385979445\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.510655090765588,\n \"acc_stderr\": 0.014049294536290396\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/LordNoah/spin_gpt2_medium_alpaca_e3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|arc:challenge|25_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|gsm8k|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hellaswag|10_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T08-22-40.774701.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-19T08-22-40.774701.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- '**/details_harness|winogrande|5_2024-02-19T08-22-40.774701.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-19T08-22-40.774701.parquet'
- config_name: results
data_files:
- split: 2024_02_19T08_22_40.774701
path:
- results_2024-02-19T08-22-40.774701.parquet
- split: latest
path:
- results_2024-02-19T08-22-40.774701.parquet
---
# Dataset Card for Evaluation run of LordNoah/spin_gpt2_medium_alpaca_e3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [LordNoah/spin_gpt2_medium_alpaca_e3](https://huggingface.co/LordNoah/spin_gpt2_medium_alpaca_e3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LordNoah__spin_gpt2_medium_alpaca_e3",
"harness_winogrande_5",
	split="latest")
```
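Since each run is stored as a split named by its timestamp, the most recent run can also be located programmatically: the `YYYY_MM_DDTHH_MM_SS.ffffff` naming means lexicographic order matches chronological order. A minimal sketch (the split names below are illustrative, not an exhaustive listing of this repo):

```python
# Pick the most recent timestamped split from a config's split names.
# Timestamp-named splits sort chronologically when sorted lexicographically,
# so max() over the non-"latest" names gives the newest run.
splits = [
    "2024_01_05T12_00_00.000000",  # hypothetical earlier run
    "2024_02_19T08_22_40.774701",
    "latest",
]
timestamped = [s for s in splits if s != "latest"]
most_recent = max(timestamped)
print(most_recent)  # → 2024_02_19T08_22_40.774701
```

In practice the "latest" split already aliases this newest run, so explicit sorting is only needed when comparing several historical runs.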
## Latest results
These are the [latest results from run 2024-02-19T08:22:40.774701](https://huggingface.co/datasets/open-llm-leaderboard/details_LordNoah__spin_gpt2_medium_alpaca_e3/blob/main/results_2024-02-19T08-22-40.774701.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2687245385759416,
"acc_stderr": 0.031261376728097146,
"acc_norm": 0.27140216341935136,
"acc_norm_stderr": 0.0321026383447453,
"mc1": 0.20563035495716034,
"mc1_stderr": 0.014148482219460974,
"mc2": 0.42196083615751423,
"mc2_stderr": 0.01499759385979445
},
"harness|arc:challenge|25": {
"acc": 0.2235494880546075,
"acc_stderr": 0.01217489663120261,
"acc_norm": 0.2781569965870307,
"acc_norm_stderr": 0.0130944699195388
},
"harness|hellaswag|10": {
"acc": 0.31159131647082255,
"acc_stderr": 0.004621972524152965,
"acc_norm": 0.3881696873132842,
"acc_norm_stderr": 0.004863375698153868
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.32894736842105265,
"acc_stderr": 0.03823428969926603,
"acc_norm": 0.32894736842105265,
"acc_norm_stderr": 0.03823428969926603
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2792452830188679,
"acc_stderr": 0.02761116340239972,
"acc_norm": 0.2792452830188679,
"acc_norm_stderr": 0.02761116340239972
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.031265112061730424,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.031265112061730424
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171452,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171452
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.20425531914893616,
"acc_stderr": 0.026355158413349424,
"acc_norm": 0.20425531914893616,
"acc_norm_stderr": 0.026355158413349424
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489362,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489362
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2896551724137931,
"acc_stderr": 0.03780019230438015,
"acc_norm": 0.2896551724137931,
"acc_norm_stderr": 0.03780019230438015
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24867724867724866,
"acc_stderr": 0.022261817692400168,
"acc_norm": 0.24867724867724866,
"acc_norm_stderr": 0.022261817692400168
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047181,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047181
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25806451612903225,
"acc_stderr": 0.02489246917246284,
"acc_norm": 0.25806451612903225,
"acc_norm_stderr": 0.02489246917246284
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.32019704433497537,
"acc_stderr": 0.032826493853041504,
"acc_norm": 0.32019704433497537,
"acc_norm_stderr": 0.032826493853041504
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2787878787878788,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.2787878787878788,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35858585858585856,
"acc_stderr": 0.03416903640391521,
"acc_norm": 0.35858585858585856,
"acc_norm_stderr": 0.03416903640391521
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.35233160621761656,
"acc_stderr": 0.03447478286414359,
"acc_norm": 0.35233160621761656,
"acc_norm_stderr": 0.03447478286414359
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3384615384615385,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.3384615384615385,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.02592887613276611,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.02592887613276611
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.226890756302521,
"acc_stderr": 0.027205371538279483,
"acc_norm": 0.226890756302521,
"acc_norm_stderr": 0.027205371538279483
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3192660550458716,
"acc_stderr": 0.019987829069750013,
"acc_norm": 0.3192660550458716,
"acc_norm_stderr": 0.019987829069750013
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.03338473403207401,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.03338473403207401
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.030190282453501943,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.030190282453501943
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.13452914798206278,
"acc_stderr": 0.022901183761575586,
"acc_norm": 0.13452914798206278,
"acc_norm_stderr": 0.022901183761575586
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3053435114503817,
"acc_stderr": 0.04039314978724561,
"acc_norm": 0.3053435114503817,
"acc_norm_stderr": 0.04039314978724561
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.17355371900826447,
"acc_stderr": 0.0345727283691767,
"acc_norm": 0.17355371900826447,
"acc_norm_stderr": 0.0345727283691767
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.041331194402438376,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.041331194402438376
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.16964285714285715,
"acc_stderr": 0.0356236785009539,
"acc_norm": 0.16964285714285715,
"acc_norm_stderr": 0.0356236785009539
},
"harness|hendrycksTest-management|5": {
"acc": 0.2912621359223301,
"acc_stderr": 0.044986763205729224,
"acc_norm": 0.2912621359223301,
"acc_norm_stderr": 0.044986763205729224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.23076923076923078,
"acc_stderr": 0.027601921381417614,
"acc_norm": 0.23076923076923078,
"acc_norm_stderr": 0.027601921381417614
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2861271676300578,
"acc_stderr": 0.024332146779134128,
"acc_norm": 0.2861271676300578,
"acc_norm_stderr": 0.024332146779134128
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25027932960893856,
"acc_stderr": 0.014487500852850412,
"acc_norm": 0.25027932960893856,
"acc_norm_stderr": 0.014487500852850412
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2990353697749196,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.2990353697749196,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.02346842983245115,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.02346842983245115
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2375886524822695,
"acc_stderr": 0.025389512552729906,
"acc_norm": 0.2375886524822695,
"acc_norm_stderr": 0.025389512552729906
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.26792698826597133,
"acc_stderr": 0.011311347690633885,
"acc_norm": 0.26792698826597133,
"acc_norm_stderr": 0.011311347690633885
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3602941176470588,
"acc_stderr": 0.029163128570670736,
"acc_norm": 0.3602941176470588,
"acc_norm_stderr": 0.029163128570670736
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2679738562091503,
"acc_stderr": 0.017917974069594726,
"acc_norm": 0.2679738562091503,
"acc_norm_stderr": 0.017917974069594726
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2,
"acc_stderr": 0.03831305140884603,
"acc_norm": 0.2,
"acc_norm_stderr": 0.03831305140884603
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3469387755102041,
"acc_stderr": 0.030472526026726492,
"acc_norm": 0.3469387755102041,
"acc_norm_stderr": 0.030472526026726492
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401467,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401467
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-virology|5": {
"acc": 0.22289156626506024,
"acc_stderr": 0.03240004825594689,
"acc_norm": 0.22289156626506024,
"acc_norm_stderr": 0.03240004825594689
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.03377310252209195,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.03377310252209195
},
"harness|truthfulqa:mc|0": {
"mc1": 0.20563035495716034,
"mc1_stderr": 0.014148482219460974,
"mc2": 0.42196083615751423,
"mc2_stderr": 0.01499759385979445
},
"harness|winogrande|5": {
"acc": 0.510655090765588,
"acc_stderr": 0.014049294536290396
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
manavshah0716/vgg16 | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 164721373.0
num_examples: 1
download_size: 150546939
dataset_size: 164721373.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
akpotluri/NLPFinalProject | ---
license: openrail
---
|
distilled-from-one-sec-cv12/chunk_226 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1174324768
num_examples: 228824
download_size: 1200622866
dataset_size: 1174324768
---
# Dataset Card for "chunk_226"
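For cards like this one, the YAML header already answers basic capacity questions without downloading anything. A minimal sketch (all numbers copied from the header above; the interpretation of the download/dataset ratio is an assumption, not something the card states):

```python
# Quick sanity check of the metadata declared in the YAML header above.
# All numbers are copied from the card itself; nothing is downloaded.
num_bytes = 1_174_324_768      # declared serialized size of the train split
num_examples = 228_824         # declared number of examples
download_size = 1_200_622_866  # declared size of the Parquet download

avg_bytes = num_bytes / num_examples
print(f"average size per example: {avg_bytes:.0f} bytes")  # ~5 KB each

# The download being slightly larger than the in-memory size is plausible
# for sharded Parquet files with per-shard metadata overhead.
print(f"download/dataset ratio: {download_size / num_bytes:.3f}")
```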
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
argilla/10k_prompts_avg_rating3_SPIN_iter0_top | ---
dataset_info:
features:
- name: real
list:
- name: content
dtype: string
- name: role
dtype: string
- name: generated
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 12123785.277867528
num_examples: 2785
- name: test
num_bytes: 1349505.7221324716
num_examples: 310
download_size: 7291620
dataset_size: 13473291.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
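The fractional `num_bytes` values above are typical of a proportional train/test partition of a parent dataset. A quick consistency check on the declared numbers (pure arithmetic on values copied verbatim from the YAML header; no download involved):

```python
# Verify that the declared split sizes add up to the declared total.
# Values are copied verbatim from the YAML header above.
train_bytes = 12123785.277867528
test_bytes = 1349505.7221324716
dataset_size = 13473291.0

# Allow a tiny tolerance for floating-point rounding in the stored values.
assert abs((train_bytes + test_bytes) - dataset_size) < 1e-3

train_examples, test_examples = 2785, 310
print("total examples:", train_examples + test_examples)  # roughly a 90/10 split
```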
|
open-llm-leaderboard/details_nomic-ai__gpt4all-j | ---
pretty_name: Evaluation run of nomic-ai/gpt4all-j
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nomic-ai/gpt4all-j](https://huggingface.co/nomic-ai/gpt4all-j) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nomic-ai__gpt4all-j\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-03T19:47:17.747407](https://huggingface.co/datasets/open-llm-leaderboard/details_nomic-ai__gpt4all-j/blob/main/results_2023-10-03T19-47-17.747407.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2872188662344035,\n\
\ \"acc_stderr\": 0.03256180276717163,\n \"acc_norm\": 0.2903884169662422,\n\
\ \"acc_norm_stderr\": 0.03256130411949783,\n \"mc1\": 0.2827417380660955,\n\
\ \"mc1_stderr\": 0.015764770836777305,\n \"mc2\": 0.4277581711709451,\n\
\ \"mc2_stderr\": 0.014665895347989117\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.38993174061433444,\n \"acc_stderr\": 0.014252959848892884,\n\
\ \"acc_norm\": 0.4197952218430034,\n \"acc_norm_stderr\": 0.014422181226303022\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.483469428400717,\n\
\ \"acc_stderr\": 0.004987053652540272,\n \"acc_norm\": 0.6406094403505278,\n\
\ \"acc_norm_stderr\": 0.004788412062375707\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.35555555555555557,\n\
\ \"acc_stderr\": 0.04135176749720386,\n \"acc_norm\": 0.35555555555555557,\n\
\ \"acc_norm_stderr\": 0.04135176749720386\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.32452830188679244,\n \"acc_stderr\": 0.028815615713432118,\n\
\ \"acc_norm\": 0.32452830188679244,\n \"acc_norm_stderr\": 0.028815615713432118\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2986111111111111,\n\
\ \"acc_stderr\": 0.038270523579507554,\n \"acc_norm\": 0.2986111111111111,\n\
\ \"acc_norm_stderr\": 0.038270523579507554\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\"\
: 0.19,\n \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n\
\ \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n\
\ \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.044405219061793254,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.044405219061793254\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.251063829787234,\n \"acc_stderr\": 0.028346963777162452,\n\
\ \"acc_norm\": 0.251063829787234,\n \"acc_norm_stderr\": 0.028346963777162452\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748142,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748142\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.35172413793103446,\n \"acc_stderr\": 0.03979236637497412,\n\
\ \"acc_norm\": 0.35172413793103446,\n \"acc_norm_stderr\": 0.03979236637497412\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.21164021164021163,\n \"acc_stderr\": 0.021037331505262893,\n \"\
acc_norm\": 0.21164021164021163,\n \"acc_norm_stderr\": 0.021037331505262893\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n\
\ \"acc_stderr\": 0.038522733649243156,\n \"acc_norm\": 0.24603174603174602,\n\
\ \"acc_norm_stderr\": 0.038522733649243156\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2645161290322581,\n\
\ \"acc_stderr\": 0.02509189237885928,\n \"acc_norm\": 0.2645161290322581,\n\
\ \"acc_norm_stderr\": 0.02509189237885928\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.29064039408866993,\n \"acc_stderr\": 0.0319474007226554,\n\
\ \"acc_norm\": 0.29064039408866993,\n \"acc_norm_stderr\": 0.0319474007226554\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\"\
: 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03453131801885415,\n\
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03453131801885415\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3383838383838384,\n \"acc_stderr\": 0.03371124142626302,\n \"\
acc_norm\": 0.3383838383838384,\n \"acc_norm_stderr\": 0.03371124142626302\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.2694300518134715,\n \"acc_stderr\": 0.03201867122877794,\n\
\ \"acc_norm\": 0.2694300518134715,\n \"acc_norm_stderr\": 0.03201867122877794\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.32051282051282054,\n \"acc_stderr\": 0.023661296393964273,\n\
\ \"acc_norm\": 0.32051282051282054,\n \"acc_norm_stderr\": 0.023661296393964273\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085626,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085626\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3067226890756303,\n \"acc_stderr\": 0.029953823891887037,\n\
\ \"acc_norm\": 0.3067226890756303,\n \"acc_norm_stderr\": 0.029953823891887037\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23486238532110093,\n \"acc_stderr\": 0.018175110510343588,\n \"\
acc_norm\": 0.23486238532110093,\n \"acc_norm_stderr\": 0.018175110510343588\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502326,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502326\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604257,\n \"\
acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604257\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.27848101265822783,\n \"acc_stderr\": 0.02917868230484255,\n \
\ \"acc_norm\": 0.27848101265822783,\n \"acc_norm_stderr\": 0.02917868230484255\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.25112107623318386,\n\
\ \"acc_stderr\": 0.029105220833224595,\n \"acc_norm\": 0.25112107623318386,\n\
\ \"acc_norm_stderr\": 0.029105220833224595\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.33884297520661155,\n \"acc_stderr\": 0.04320767807536669,\n \"\
acc_norm\": 0.33884297520661155,\n \"acc_norm_stderr\": 0.04320767807536669\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2037037037037037,\n\
\ \"acc_stderr\": 0.038935425188248475,\n \"acc_norm\": 0.2037037037037037,\n\
\ \"acc_norm_stderr\": 0.038935425188248475\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.038946411200447915,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.038946411200447915\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.24271844660194175,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.24271844660194175,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2863247863247863,\n\
\ \"acc_stderr\": 0.029614323690456648,\n \"acc_norm\": 0.2863247863247863,\n\
\ \"acc_norm_stderr\": 0.029614323690456648\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.015671006009339586,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.015671006009339586\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2398843930635838,\n \"acc_stderr\": 0.022989592543123563,\n\
\ \"acc_norm\": 0.2398843930635838,\n \"acc_norm_stderr\": 0.022989592543123563\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n\
\ \"acc_stderr\": 0.014893391735249588,\n \"acc_norm\": 0.27262569832402234,\n\
\ \"acc_norm_stderr\": 0.014893391735249588\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3366013071895425,\n \"acc_stderr\": 0.027057974624494382,\n\
\ \"acc_norm\": 0.3366013071895425,\n \"acc_norm_stderr\": 0.027057974624494382\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3247588424437299,\n\
\ \"acc_stderr\": 0.026596782287697046,\n \"acc_norm\": 0.3247588424437299,\n\
\ \"acc_norm_stderr\": 0.026596782287697046\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.3055555555555556,\n \"acc_stderr\": 0.025630824975621348,\n\
\ \"acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.025630824975621348\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2872340425531915,\n \"acc_stderr\": 0.026992199173064356,\n \
\ \"acc_norm\": 0.2872340425531915,\n \"acc_norm_stderr\": 0.026992199173064356\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2516297262059974,\n\
\ \"acc_stderr\": 0.011083276280441912,\n \"acc_norm\": 0.2516297262059974,\n\
\ \"acc_norm_stderr\": 0.011083276280441912\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.39705882352941174,\n \"acc_stderr\": 0.029722152099280055,\n\
\ \"acc_norm\": 0.39705882352941174,\n \"acc_norm_stderr\": 0.029722152099280055\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.03831305140884603,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.03831305140884603\n },\n\
\ \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.3673469387755102,\n\
\ \"acc_stderr\": 0.030862144921087558,\n \"acc_norm\": 0.3673469387755102,\n\
\ \"acc_norm_stderr\": 0.030862144921087558\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.3482587064676617,\n \"acc_stderr\": 0.033687874661154596,\n\
\ \"acc_norm\": 0.3482587064676617,\n \"acc_norm_stderr\": 0.033687874661154596\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.2289156626506024,\n \"acc_stderr\": 0.03270745277352477,\n\
\ \"acc_norm\": 0.2289156626506024,\n \"acc_norm_stderr\": 0.03270745277352477\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.30994152046783624,\n\
\ \"acc_stderr\": 0.03546976959393163,\n \"acc_norm\": 0.30994152046783624,\n\
\ \"acc_norm_stderr\": 0.03546976959393163\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.2827417380660955,\n \"mc1_stderr\": 0.015764770836777305,\n\
\ \"mc2\": 0.4277581711709451,\n \"mc2_stderr\": 0.014665895347989117\n\
\ }\n}\n```"
repo_url: https://huggingface.co/nomic-ai/gpt4all-j
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|arc:challenge|25_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|arc:challenge|25_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_18T00_56_07.678298
path:
- '**/details_harness|drop|3_2023-09-18T00-56-07.678298.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-18T00-56-07.678298.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_18T00_56_07.678298
path:
- '**/details_harness|gsm8k|5_2023-09-18T00-56-07.678298.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-18T00-56-07.678298.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hellaswag|10_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hellaswag|10_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T12:03:22.271414.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-47-17.747407.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_23T12_03_22.271414
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T12:03:22.271414.parquet'
- split: 2023_10_03T19_47_17.747407
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T19-47-17.747407.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T19-47-17.747407.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_18T00_56_07.678298
path:
- '**/details_harness|winogrande|5_2023-09-18T00-56-07.678298.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-18T00-56-07.678298.parquet'
- config_name: results
data_files:
- split: 2023_09_18T00_56_07.678298
path:
- results_2023-09-18T00-56-07.678298.parquet
- split: 2023_10_03T19_47_17.747407
path:
- results_2023-10-03T19-47-17.747407.parquet
- split: latest
path:
- results_2023-10-03T19-47-17.747407.parquet
---
# Dataset Card for Evaluation run of nomic-ai/gpt4all-j
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/nomic-ai/gpt4all-j
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [nomic-ai/gpt4all-j](https://huggingface.co/nomic-ai/gpt4all-j) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nomic-ai__gpt4all-j",
"harness_truthfulqa_mc_0",
	split="latest")
```
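As the configurations above show, the "latest" split is simply an alias for the most recent timestamped run. A minimal sketch of that selection logic (the `pick_latest` helper is hypothetical, for illustration only, and not part of the `datasets` library):

```python
from datetime import datetime

def pick_latest(split_names):
    """Return the most recent timestamped split name.

    Split names follow the pattern YYYY_MM_DDTHH_MM_SS.ffffff,
    as used in this repository's configurations.
    """
    def parse(name):
        return datetime.strptime(name, "%Y_%m_%dT%H_%M_%S.%f")
    return max(split_names, key=parse)

splits = ["2023_08_23T12_03_22.271414", "2023_10_03T19_47_17.747407"]
print(pick_latest(splits))  # → 2023_10_03T19_47_17.747407
```

In other words, requesting `split="latest"` resolves to the same files as the newest timestamped split.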
## Latest results
These are the [latest results from run 2023-10-03T19:47:17.747407](https://huggingface.co/datasets/open-llm-leaderboard/details_nomic-ai__gpt4all-j/blob/main/results_2023-10-03T19-47-17.747407.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2872188662344035,
"acc_stderr": 0.03256180276717163,
"acc_norm": 0.2903884169662422,
"acc_norm_stderr": 0.03256130411949783,
"mc1": 0.2827417380660955,
"mc1_stderr": 0.015764770836777305,
"mc2": 0.4277581711709451,
"mc2_stderr": 0.014665895347989117
},
"harness|arc:challenge|25": {
"acc": 0.38993174061433444,
"acc_stderr": 0.014252959848892884,
"acc_norm": 0.4197952218430034,
"acc_norm_stderr": 0.014422181226303022
},
"harness|hellaswag|10": {
"acc": 0.483469428400717,
"acc_stderr": 0.004987053652540272,
"acc_norm": 0.6406094403505278,
"acc_norm_stderr": 0.004788412062375707
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.04135176749720386,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.04135176749720386
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.32452830188679244,
"acc_stderr": 0.028815615713432118,
"acc_norm": 0.32452830188679244,
"acc_norm_stderr": 0.028815615713432118
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2986111111111111,
"acc_stderr": 0.038270523579507554,
"acc_norm": 0.2986111111111111,
"acc_norm_stderr": 0.038270523579507554
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.044405219061793254,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.044405219061793254
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.251063829787234,
"acc_stderr": 0.028346963777162452,
"acc_norm": 0.251063829787234,
"acc_norm_stderr": 0.028346963777162452
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748142,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748142
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.35172413793103446,
"acc_stderr": 0.03979236637497412,
"acc_norm": 0.35172413793103446,
"acc_norm_stderr": 0.03979236637497412
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.21164021164021163,
"acc_stderr": 0.021037331505262893,
"acc_norm": 0.21164021164021163,
"acc_norm_stderr": 0.021037331505262893
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.038522733649243156,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.038522733649243156
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2645161290322581,
"acc_stderr": 0.02509189237885928,
"acc_norm": 0.2645161290322581,
"acc_norm_stderr": 0.02509189237885928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.29064039408866993,
"acc_stderr": 0.0319474007226554,
"acc_norm": 0.29064039408866993,
"acc_norm_stderr": 0.0319474007226554
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3383838383838384,
"acc_stderr": 0.03371124142626302,
"acc_norm": 0.3383838383838384,
"acc_norm_stderr": 0.03371124142626302
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.2694300518134715,
"acc_stderr": 0.03201867122877794,
"acc_norm": 0.2694300518134715,
"acc_norm_stderr": 0.03201867122877794
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.32051282051282054,
"acc_stderr": 0.023661296393964273,
"acc_norm": 0.32051282051282054,
"acc_norm_stderr": 0.023661296393964273
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085626,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3067226890756303,
"acc_stderr": 0.029953823891887037,
"acc_norm": 0.3067226890756303,
"acc_norm_stderr": 0.029953823891887037
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23486238532110093,
"acc_stderr": 0.018175110510343588,
"acc_norm": 0.23486238532110093,
"acc_norm_stderr": 0.018175110510343588
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502326,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502326
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604257,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604257
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.27848101265822783,
"acc_stderr": 0.02917868230484255,
"acc_norm": 0.27848101265822783,
"acc_norm_stderr": 0.02917868230484255
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.25112107623318386,
"acc_stderr": 0.029105220833224595,
"acc_norm": 0.25112107623318386,
"acc_norm_stderr": 0.029105220833224595
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.33884297520661155,
"acc_stderr": 0.04320767807536669,
"acc_norm": 0.33884297520661155,
"acc_norm_stderr": 0.04320767807536669
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2037037037037037,
"acc_stderr": 0.038935425188248475,
"acc_norm": 0.2037037037037037,
"acc_norm_stderr": 0.038935425188248475
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.038946411200447915,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.038946411200447915
},
"harness|hendrycksTest-management|5": {
"acc": 0.24271844660194175,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.24271844660194175,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2863247863247863,
"acc_stderr": 0.029614323690456648,
"acc_norm": 0.2863247863247863,
"acc_norm_stderr": 0.029614323690456648
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.015671006009339586,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.015671006009339586
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2398843930635838,
"acc_stderr": 0.022989592543123563,
"acc_norm": 0.2398843930635838,
"acc_norm_stderr": 0.022989592543123563
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249588,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3366013071895425,
"acc_stderr": 0.027057974624494382,
"acc_norm": 0.3366013071895425,
"acc_norm_stderr": 0.027057974624494382
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3247588424437299,
"acc_stderr": 0.026596782287697046,
"acc_norm": 0.3247588424437299,
"acc_norm_stderr": 0.026596782287697046
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.025630824975621348,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.025630824975621348
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2872340425531915,
"acc_stderr": 0.026992199173064356,
"acc_norm": 0.2872340425531915,
"acc_norm_stderr": 0.026992199173064356
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2516297262059974,
"acc_stderr": 0.011083276280441912,
"acc_norm": 0.2516297262059974,
"acc_norm_stderr": 0.011083276280441912
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.39705882352941174,
"acc_stderr": 0.029722152099280055,
"acc_norm": 0.39705882352941174,
"acc_norm_stderr": 0.029722152099280055
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2,
"acc_stderr": 0.03831305140884603,
"acc_norm": 0.2,
"acc_norm_stderr": 0.03831305140884603
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3673469387755102,
"acc_stderr": 0.030862144921087558,
"acc_norm": 0.3673469387755102,
"acc_norm_stderr": 0.030862144921087558
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.3482587064676617,
"acc_stderr": 0.033687874661154596,
"acc_norm": 0.3482587064676617,
"acc_norm_stderr": 0.033687874661154596
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2289156626506024,
"acc_stderr": 0.03270745277352477,
"acc_norm": 0.2289156626506024,
"acc_norm_stderr": 0.03270745277352477
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.30994152046783624,
"acc_stderr": 0.03546976959393163,
"acc_norm": 0.30994152046783624,
"acc_norm_stderr": 0.03546976959393163
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2827417380660955,
"mc1_stderr": 0.015764770836777305,
"mc2": 0.4277581711709451,
"mc2_stderr": 0.014665895347989117
}
}
```
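Once the details are loaded (e.g. via the snippet above), these per-task scores can be aggregated in plain Python. A minimal sketch of an unweighted MMLU macro-average — the `harness|hendrycksTest` key prefix is taken from the JSON above; everything else is illustrative:

```python
def mmlu_macro_average(results, metric="acc"):
    """Unweighted mean of `metric` over the hendrycksTest (MMLU) sub-tasks."""
    values = [
        scores[metric]
        for task, scores in results.items()
        if task.startswith("harness|hendrycksTest")
    ]
    return sum(values) / len(values)

# Two sub-tasks copied from the JSON above; non-MMLU keys are ignored.
sample = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.27},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.35555555555555557},
    "harness|truthfulqa:mc|0": {"mc1": 0.2827417380660955},
}
print(round(mmlu_macro_average(sample), 4))  # -> 0.3128
```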
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/blue_poison_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of blue_poison/アズリウス/蓝毒 (Arknights)
This is the dataset of blue_poison/アズリウス/蓝毒 (Arknights), containing 500 images and their tags.
The core tags of this character are `pink_hair, blue_eyes, long_hair, breasts, twintails, low_twintails, braid, twin_braids, hairband`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 877.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/blue_poison_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 396.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/blue_poison_arknights/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1293 | 908.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/blue_poison_arknights/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 726.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/blue_poison_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1293 | 1.40 GiB | [Download](https://huggingface.co/datasets/CyberHarem/blue_poison_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
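For the `IMG+TXT` packages, each image is expected to ship with a same-named `.txt` file of comma-separated tags. A minimal sketch for walking an extracted package directory — the same-stem convention and tag format are assumptions about the layout, so verify against a downloaded archive:

```python
import os

def iter_image_tag_pairs(dataset_dir, image_exts=(".png", ".jpg", ".jpeg", ".webp")):
    """Yield (image_path, tags) pairs from an extracted IMG+TXT package."""
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in image_exts:
            continue
        txt_path = os.path.join(dataset_dir, stem + ".txt")
        if not os.path.exists(txt_path):
            continue  # image without a tag file; skip it
        with open(txt_path, encoding="utf-8") as f:
            tags = [t.strip() for t in f.read().split(",") if t.strip()]
        yield os.path.join(dataset_dir, name), tags
```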
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/blue_poison_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, bare_shoulders, crop_top, earrings, leggings, midriff, navel, off_shoulder, official_alternate_costume, open_jacket, pink_choker, short_shorts, solo, spaghetti_strap, sports_bra, stomach, white_jacket, blue_shorts, cowboy_shot, holding_bottle, long_sleeves, looking_at_viewer, medium_breasts, standing, cleavage, black_pants, collarbone, white_background, hand_up |
| 1 | 9 |  |  |  |  |  | 1girl, bare_shoulders, crop_top, leggings, long_sleeves, looking_at_viewer, midriff, off_shoulder, official_alternate_costume, open_jacket, pink_choker, solo, spaghetti_strap, sports_bra, white_jacket, black_pants, blue_shorts, earrings, navel, sneakers, stomach, white_footwear, holding_bottle, short_shorts, sitting, cleavage, medium_breasts, water_bottle, collarbone, smile, full_body, white_background |
| 2 | 6 |  |  |  |  |  | 1girl, bare_shoulders, earrings, looking_at_viewer, off_shoulder, official_alternate_costume, open_jacket, pink_choker, simple_background, smile, solo, spaghetti_strap, sports_bra, upper_body, white_background, camisole, cleavage, collarbone, small_breasts, white_jacket, closed_mouth, crop_top, long_sleeves, medium_breasts |
| 3 | 5 |  |  |  |  |  | 1girl, blue_jacket, hooded_jacket, long_sleeves, looking_at_viewer, open_jacket, solo, white_shirt, collared_shirt, hood_up, open_mouth, upper_body, hair_between_eyes, holding |
| 4 | 5 |  |  |  |  |  | 1girl, black_ribbon, blue_jacket, collared_shirt, dress_shirt, hooded_jacket, long_sleeves, looking_at_viewer, neck_ribbon, open_jacket, solo, upper_body, white_shirt, blush, hood_up, simple_background, smile, closed_mouth, white_background, buttons, hands_up, medium_hair, open_mouth, suspenders |
| 5 | 5 |  |  |  |  |  | 1girl, black_shorts, blue_jacket, blush, crossbow, hood_up, hooded_jacket, long_sleeves, looking_at_viewer, open_jacket, pouch, simple_background, solo, white_background, white_shirt, closed_mouth, black_thighhighs, cowboy_shot, short_shorts, ahoge, character_name, collared_shirt, holding_gun, signature, suspender_shorts |
| 6 | 7 |  |  |  |  |  | 1girl, black_shorts, black_thighhighs, blue_jacket, holding_weapon, hooded_jacket, long_sleeves, open_jacket, solo, white_shirt, crossbow, shoes, water, full_body, hood_up, pouch, simple_background, suspender_shorts, parted_lips, white_background, ahoge, black_footwear, blue_footwear, collared_shirt, looking_at_viewer |
| 7 | 10 |  |  |  |  |  | 1girl, blush, open_mouth, 1boy, completely_nude, hetero, nipples, pussy, sex, navel, solo_focus, tongue_out, large_breasts, penis, sweat, vaginal, choker, cum, heart, jewelry, looking_at_viewer, mosaic_censoring, spread_legs |
| 8 | 6 |  |  |  |  |  | 1girl, looking_at_viewer, looking_back, solo, ass_focus, cameltoe, from_behind, large_breasts, ahoge, black_bodysuit, blush, closed_mouth, clothing_cutout, panties, partially_visible_vulva, thighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | crop_top | earrings | leggings | midriff | navel | off_shoulder | official_alternate_costume | open_jacket | pink_choker | short_shorts | solo | spaghetti_strap | sports_bra | stomach | white_jacket | blue_shorts | cowboy_shot | holding_bottle | long_sleeves | looking_at_viewer | medium_breasts | standing | cleavage | black_pants | collarbone | white_background | hand_up | sneakers | white_footwear | sitting | water_bottle | smile | full_body | simple_background | upper_body | camisole | small_breasts | closed_mouth | blue_jacket | hooded_jacket | white_shirt | collared_shirt | hood_up | open_mouth | hair_between_eyes | holding | black_ribbon | dress_shirt | neck_ribbon | blush | buttons | hands_up | medium_hair | suspenders | black_shorts | crossbow | pouch | black_thighhighs | ahoge | character_name | holding_gun | signature | suspender_shorts | holding_weapon | shoes | water | parted_lips | black_footwear | blue_footwear | 1boy | completely_nude | hetero | nipples | pussy | sex | solo_focus | tongue_out | large_breasts | penis | sweat | vaginal | choker | cum | heart | jewelry | mosaic_censoring | spread_legs | looking_back | ass_focus | cameltoe | from_behind | black_bodysuit | clothing_cutout | panties | partially_visible_vulva | thighs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-----------|:-----------|:-----------|:----------|:--------|:---------------|:-----------------------------|:--------------|:--------------|:---------------|:-------|:------------------|:-------------|:----------|:---------------|:--------------|:--------------|:-----------------|:---------------|:--------------------|:-----------------|:-----------|:-----------|:--------------|:-------------|:-------------------|:----------|:-----------|:-----------------|:----------|:---------------|:--------|:------------|:--------------------|:-------------|:-----------|:----------------|:---------------|:--------------|:----------------|:--------------|:-----------------|:----------|:-------------|:--------------------|:----------|:---------------|:--------------|:--------------|:--------|:----------|:-----------|:--------------|:-------------|:---------------|:-----------|:--------|:-------------------|:--------|:-----------------|:--------------|:------------|:-------------------|:-----------------|:--------|:--------|:--------------|:-----------------|:----------------|:-------|:------------------|:---------|:----------|:--------|:------|:-------------|:-------------|:----------------|:--------|:--------|:----------|:---------|:------|:--------|:----------|:-------------------|:--------------|:---------------|:------------|:-----------|:--------------|:-----------------|:------------------|:----------|:--------------------------|:---------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | X | X | X | X | | X | X | X | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | X | | | | X | X | X | X | | X | X | X | | X | | | | X | X | X | | X | | X | X | | | | | | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | | | | | | | | X | | | X | | | | | | | | X | X | | | | | | | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | | | | | | | | X | | | X | | | | | | | | X | X | | | | | | X | | | | | | X | | X | X | | | X | X | X | X | X | X | X | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | | | | | | | | X | | X | X | | | | | | X | | X | X | | | | | | X | | | | | | | | X | | | | X | X | X | X | X | X | | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 7 |  |  |  |  |  | X | | | | | | | | | X | | | X | | | | | | | | X | X | | | | | | X | | | | | | | X | X | | | | | X | X | X | X | X | | | | | | | | | | | | X | X | X | X | X | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 10 |  |  |  |  |  | X | | | | | | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 8 | 6 |  |  |  |  |  | X | | | | | | | | | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X |
|
Chident/aaa | ---
license: cc-by-nc-sa-4.0
---
|
anan-2024/twitter_dataset_1713014868 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 1334310
num_examples: 3608
download_size: 659450
dataset_size: 1334310
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zpn/bace_classification | ---
annotations_creators:
- machine-generated
language_creators:
- machine-generated
license:
- mit
multilinguality:
- monolingual
pretty_name: bace_classification
size_categories:
- 1K<n<10K
source_datasets: []
tags:
- bio
- bio-chem
- molnet
- molecule-net
- biophysics
task_categories:
- other
task_ids: []
---
# Dataset Card for bace_classification
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage: https://moleculenet.org/**
- **Repository: https://github.com/deepchem/deepchem/tree/master**
- **Paper: https://arxiv.org/abs/1703.00564**
### Dataset Summary
`bace_classification` is a dataset included in [MoleculeNet](https://moleculenet.org/). This dataset consists of qualitative (binary label) binding results for a set of inhibitors of human β-secretase 1 (BACE-1).
## Dataset Structure
### Data Fields
Each split contains
* `smiles`: the [SMILES](https://en.wikipedia.org/wiki/Simplified_molecular-input_line-entry_system) representation of a molecule
* `selfies`: the [SELFIES](https://github.com/aspuru-guzik-group/selfies) representation of a molecule
* `target`: the binary label binding results
### Data Splits
The dataset is split into an 80/10/10 train/valid/test split using scaffold split.
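A quick way to sanity-check the binary `target` field is its class balance per split. A minimal sketch of such a check; the commented `load_dataset` usage is an assumption based on this card's repo id, default configuration, and standard split names:

```python
def class_balance(targets):
    """Fraction of positive (1) labels in a binary `target` column."""
    targets = list(targets)
    return sum(targets) / len(targets) if targets else 0.0

# Typical usage (requires the `datasets` library and network access):
#   from datasets import load_dataset
#   ds = load_dataset("zpn/bace_classification")
#   print({split: round(class_balance(ds[split]["target"]), 3) for split in ds})

print(class_balance([1, 0, 1, 1]))  # -> 0.75
```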
### Source Data
#### Initial Data Collection and Normalization
Data was originally generated by the Pande Group at Stanford.
### Licensing Information
This dataset was originally released under an MIT license.
### Citation Information
```
@misc{https://doi.org/10.48550/arxiv.1703.00564,
doi = {10.48550/ARXIV.1703.00564},
url = {https://arxiv.org/abs/1703.00564},
author = {Wu, Zhenqin and Ramsundar, Bharath and Feinberg, Evan N. and Gomes, Joseph and Geniesse, Caleb and Pappu, Aneesh S. and Leswing, Karl and Pande, Vijay},
keywords = {Machine Learning (cs.LG), Chemical Physics (physics.chem-ph), Machine Learning (stat.ML), FOS: Computer and information sciences, FOS: Computer and information sciences, FOS: Physical sciences, FOS: Physical sciences},
title = {MoleculeNet: A Benchmark for Molecular Machine Learning},
publisher = {arXiv},
year = {2017},
copyright = {arXiv.org perpetual, non-exclusive license}
}
```
### Contributions
Thanks to [@zanussbaum](https://github.com/zanussbaum) for adding this dataset.
|
Smuggling1710/vERP | ---
license: apache-2.0
---
|
wbxlala/sleep_edf_3he1 | ---
license: odc-by
---
|
biglam/hmd_newspapers | ---
dataset_info:
features:
- name: source
dtype: string
- name: title
dtype: string
- name: location
dtype: string
- name: date
dtype: timestamp[s]
- name: item_type
dtype: string
- name: word_count
dtype: int32
- name: ocr_quality_mean
dtype: float64
- name: ocr_quality_sd
dtype: float64
- name: text
dtype: string
splits:
- name: train
num_bytes: 14304741164
num_examples: 3065408
download_size: 9682476047
dataset_size: 14304741164
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: cc0-1.0
task_categories:
- text-generation
language:
- en
tags:
- newspapers
pretty_name: Heritage Made Digital Newspapers
size_categories:
- 1M<n<10M
---
# Dataset Card for Heritage Made Digital Newspapers
## Table of Contents
- [Dataset Card for Heritage Made Digital Newspapers](#dataset-card-for-heritage-made-digital-newspapers)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:** https://bl.iro.bl.uk/?locale=en
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains text extracted at the article level from historic digitised newspapers from the [Heritage Made Digital](https://bl.iro.bl.uk/collections/9a6a4cdd-2bfe-47bb-8c14-c0a5d100501f?locale=en) newspaper digitisation program at the [British Library](https://www.bl.uk/). The newspapers in the dataset were published between 1800 and 1896. This dataset contains ~2.5 billion tokens and 3,065,408 articles.
The dataset contains text generated from Optical Character Recognition software on digitised newspaper pages. This dataset includes the plain text from the OCR alongside some minimal metadata associated with the newspaper from which the text is derived and OCR confidence score information generated from the OCR software.
### Supported Tasks and Leaderboards
This dataset can be used for:
- historical research and digital humanities research
- training language models
- training historic language models
Whilst this dataset can be used for all of these tasks, it is important to understand that the dataset was not constructed in a representative way, so it contains biases in terms of the newspapers and articles that are included (more on this below).
### Languages
The text in this dataset is in English that has been recognised by the OCR software. The OCR software used is generic commercial OCR software that has not been trained on historic newspapers. There are therefore many errors in the text. Some of the OCR in this text will be of such poor quality that it is incomprehensible to a human reader.
## Dataset Structure
### Data Instances
Each row in the dataset is an article from a newspaper as recognised by an OLR (Optical Layout Recognition) step in the digitisation process.
### Data Splits
There is one split in this dataset, the training split.
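Because OCR quality varies widely (see the Languages section), downstream users often filter on the `ocr_quality_mean` and `word_count` fields before training. A minimal sketch of such a predicate — the thresholds are arbitrary examples, and the streaming usage shown in the comment avoids the ~9.7 GB download:

```python
def keep_article(example, min_quality=0.8, min_words=50):
    """Filter predicate: drop short or low-confidence OCR articles."""
    quality = example.get("ocr_quality_mean")
    words = example.get("word_count")
    return (
        quality is not None and quality >= min_quality
        and words is not None and words >= min_words
    )

# Typical usage (requires the `datasets` library and network access):
#   from datasets import load_dataset
#   ds = load_dataset("biglam/hmd_newspapers", split="train", streaming=True)
#   clean = ds.filter(keep_article)

print(keep_article({"ocr_quality_mean": 0.92, "word_count": 310}))  # -> True
```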
## Dataset Creation
### Curation Rationale
This dataset consists of public-domain newspapers published in the UK during the 19th Century. The majority of newspapers digitised in the UK are not freely available (even if they are out of copyright). The newspapers in this dataset were digitised specifically to be freely available but also to meet preservation goals for newspapers in poor condition. As a result, the newspapers chosen for digitisation are biased toward poor-quality physical newspapers. This may in turn result in worse OCR.
### Source Data
The source data for this dataset is the digitised newspapers from the [Heritage Made Digital](https://bl.iro.bl.uk/collections/9a6a4cdd-2bfe-47bb-8c14-c0a5d100501f?locale=en) newspaper digitisation program. The newspapers in the dataset were published between 1800 and 1870.
### Dataset Curators
The original digitisation was carried out by the British Library. The dataset was created by the British Library in partnership with Findmypast.
This dataset was created by [@davanstrien](https://huggingface.co/davanstrien).
### Licensing Information
The newspapers in this dataset are in the public domain. The dataset is licensed under a [Creative Commons Zero v1.0 Universal](https://creativecommons.org/publicdomain/zero/1.0/) license.
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@github-username](https://github.com/<github-username>) for adding this dataset.
|
TIMOTHI07/Pothole_classification | ---
license: mit
---
A dataset for efficient pothole classification containing more than 400 images collected over 50 km of road in Kerala. |
dream-textures/textures-color-normal-1k | ---
dataset_info:
features:
- name: color
dtype: image
- name: normal
dtype: image
splits:
- name: train
num_bytes: 110631687.194
num_examples: 1426
download_size: 111043422
dataset_size: 110631687.194
license: cc0-1.0
task_categories:
- image-to-image
size_categories:
- 1K<n<10K
---
# textures-color-normal-1k
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
The `textures-color-normal-1k` dataset is an image dataset of 1000+ color and normal map textures in 512x512 resolution.
The dataset was created for use in image to image tasks.
It contains a combination of CC0 procedural and photoscanned PBR materials from [ambientCG](https://ambientcg.com/).
## Dataset Structure
### Data Instances
Each data point contains a 512x512 color texture and the corresponding 512x512 normal map.
### Data Fields
* `color`: the color texture as a PIL image
* `normal`: the normal map as a PIL image
### Data Splits
| | train |
| -- | ----- |
| ambientCG | 1426 |
## Dataset Creation
### Curation Rationale
`textures-color-normal-1k` was created to provide an accessible source of data for automating 3D-asset creation workflows.
The [Dream Textures](https://github.com/carson-katri/dream-textures) add-on is one such tool providing AI automation in Blender.
By training models designed for image to image tasks, this particular use-case can be more accurately automated.
### Source Data
#### Initial Data Collection and Normalization
The data was obtained from [ambientCG](https://ambientcg.com/)'s CC0 textures. Only the color and normal maps were included in this dataset.
## Additional Information
### Dataset Curators
The dataset was created by Carson Katri, with the images being provided by [ambientCG](https://ambientcg.com/).
### Licensing Information
All of the images used in this dataset are CC0.
### Citation Information
[N/A]
### Contributions
Thanks to [@carson-katri](https://github.com/carson-katri) for adding this dataset. |
awacke1/LOINC-Clinical-Terminology | ---
license: mit
---
|
mnoukhov/openai_summarize_generated_10k | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 18033740
num_examples: 10000
download_size: 10969719
dataset_size: 18033740
---
# Dataset Card for "openai_summarize_generated_10k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gagan3012/sentiment | ---
dataset_info:
features:
- name: id
dtype: string
- name: query
dtype: string
- name: answer
dtype: string
- name: text
dtype: string
- name: choices
sequence: string
- name: gold
dtype: int64
- name: label_type
dtype: string
splits:
- name: train
num_bytes: 38958491
num_examples: 113394
download_size: 6843230
dataset_size: 38958491
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "sentiment"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
huggingartists/kizaru | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/kizaru"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.610792 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/8d81c49a2d84e2a69faf1a725343874b.434x434x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/kizaru">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">kizaru</div>
<a href="https://genius.com/artists/kizaru">
<div style="text-align: center; font-size: 14px;">@kizaru</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/kizaru).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/kizaru")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train | validation | test |
|------:|-----------:|-----:|
|   200 |          - |    - |
'Train' can be easily divided into 'train' & 'validation' & 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/kizaru")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
train, validation, test = np.split(
    datasets['train']['text'],
    [int(len(datasets['train']['text']) * train_percentage),
     int(len(datasets['train']['text']) * (train_percentage + validation_percentage))]
)
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
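The boundary arithmetic used by `np.split` above can be checked in plain Python; a minimal sketch (assuming the 200-example train split and the percentages shown):

```python
# Compute split boundaries for a 200-example dataset using the
# same percentages as the snippet above.
n_examples = 200
train_percentage = 0.9
validation_percentage = 0.07

# np.split cuts at these two indices, producing three slices.
train_end = int(n_examples * train_percentage)
validation_end = int(n_examples * (train_percentage + validation_percentage))

sizes = {
    "train": train_end,                        # examples before the first cut
    "validation": validation_end - train_end,  # examples between the cuts
    "test": n_examples - validation_end,       # examples after the second cut
}
print(sizes)  # → {'train': 180, 'validation': 14, 'test': 6}
```

Note that the test slice simply receives whatever remains after the first two cuts, so the three sizes always sum to the original split size.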
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
author={Aleksey Korshuk}
year=2021
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
judy93536/fin_phrasebank_fix | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: label
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 256094
num_examples: 1924
- name: test
num_bytes: 47170
num_examples: 339
download_size: 171135
dataset_size: 303264
---
# Dataset Card for "fin_phrasebank_fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-high_school_psychology-neg-prepend-fix | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 7683
num_examples: 5
- name: test
num_bytes: 1743191
num_examples: 545
download_size: 18091
dataset_size: 1750874
---
# Dataset Card for "mmlu-high_school_psychology-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_pacozaa__mistral-sharegpt90k-merged_16bit | ---
pretty_name: Evaluation run of pacozaa/mistral-sharegpt90k-merged_16bit
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [pacozaa/mistral-sharegpt90k-merged_16bit](https://huggingface.co/pacozaa/mistral-sharegpt90k-merged_16bit)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_pacozaa__mistral-sharegpt90k-merged_16bit\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-11T14:56:08.552887](https://huggingface.co/datasets/open-llm-leaderboard/details_pacozaa__mistral-sharegpt90k-merged_16bit/blob/main/results_2024-04-11T14-56-08.552887.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6263229489745822,\n\
\ \"acc_stderr\": 0.03253179213124674,\n \"acc_norm\": 0.6323613683810546,\n\
\ \"acc_norm_stderr\": 0.03319481559039239,\n \"mc1\": 0.3047735618115055,\n\
\ \"mc1_stderr\": 0.016114124156882452,\n \"mc2\": 0.45256376156190653,\n\
\ \"mc2_stderr\": 0.014303520809083632\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5571672354948806,\n \"acc_stderr\": 0.014515573873348899,\n\
\ \"acc_norm\": 0.5930034129692833,\n \"acc_norm_stderr\": 0.014356399418009123\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6261700856403107,\n\
\ \"acc_stderr\": 0.004828305041904404,\n \"acc_norm\": 0.8287193786098387,\n\
\ \"acc_norm_stderr\": 0.003759840127150704\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.042446332383532265,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.042446332383532265\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.039105257528497236,\n\
\ \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.039105257528497236\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n\
\ \"acc_stderr\": 0.04878317312145634,\n \"acc_norm\": 0.38,\n \
\ \"acc_norm_stderr\": 0.04878317312145634\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.036928207672648664,\n\
\ \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.036928207672648664\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n\
\ \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n\
\ \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n\
\ \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n\
\ \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.04677473004491199,\n\
\ \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.04677473004491199\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n \"\
acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944437,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944437\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7612903225806451,\n\
\ \"acc_stderr\": 0.02425107126220884,\n \"acc_norm\": 0.7612903225806451,\n\
\ \"acc_norm_stderr\": 0.02425107126220884\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479047,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479047\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659355\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608463,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608463\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.03120469122515002,\n \
\ \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.03120469122515002\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8036697247706422,\n \"acc_stderr\": 0.01703071933915434,\n \"\
acc_norm\": 0.8036697247706422,\n \"acc_norm_stderr\": 0.01703071933915434\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639318,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639318\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676166,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676166\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455005,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455005\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596914,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596914\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.037601780060266196,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.037601780060266196\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.021901905115073332,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.021901905115073332\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8058748403575989,\n\
\ \"acc_stderr\": 0.014143970276657569,\n \"acc_norm\": 0.8058748403575989,\n\
\ \"acc_norm_stderr\": 0.014143970276657569\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.024752411960917202,\n\
\ \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.024752411960917202\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.288268156424581,\n\
\ \"acc_stderr\": 0.015149132860209438,\n \"acc_norm\": 0.288268156424581,\n\
\ \"acc_norm_stderr\": 0.015149132860209438\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.025261691219729484,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.025261691219729484\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.026082700695399665,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.026082700695399665\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.025089478523765137,\n\
\ \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.025089478523765137\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4335071707953064,\n\
\ \"acc_stderr\": 0.012656810383983967,\n \"acc_norm\": 0.4335071707953064,\n\
\ \"acc_norm_stderr\": 0.012656810383983967\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6617647058823529,\n \"acc_stderr\": 0.019139943748487036,\n \
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.019139943748487036\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.02904308868330433,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.02904308868330433\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233257,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233257\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774711,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774711\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3047735618115055,\n\
\ \"mc1_stderr\": 0.016114124156882452,\n \"mc2\": 0.45256376156190653,\n\
\ \"mc2_stderr\": 0.014303520809083632\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7837411207576953,\n \"acc_stderr\": 0.01157061486140934\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3510235026535254,\n \
\ \"acc_stderr\": 0.01314694594139722\n }\n}\n```"
repo_url: https://huggingface.co/pacozaa/mistral-sharegpt90k-merged_16bit
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|arc:challenge|25_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|arc:challenge|25_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|gsm8k|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|gsm8k|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hellaswag|10_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hellaswag|10_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-11T08-28-29.120116.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-11T14-56-08.552887.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-11T14-56-08.552887.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- '**/details_harness|winogrande|5_2024-04-11T08-28-29.120116.parquet'
- split: 2024_04_11T14_56_08.552887
path:
- '**/details_harness|winogrande|5_2024-04-11T14-56-08.552887.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-11T14-56-08.552887.parquet'
- config_name: results
data_files:
- split: 2024_04_11T08_28_29.120116
path:
- results_2024-04-11T08-28-29.120116.parquet
- split: 2024_04_11T14_56_08.552887
path:
- results_2024-04-11T14-56-08.552887.parquet
- split: latest
path:
- results_2024-04-11T14-56-08.552887.parquet
---
# Dataset Card for Evaluation run of pacozaa/mistral-sharegpt90k-merged_16bit
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [pacozaa/mistral-sharegpt90k-merged_16bit](https://huggingface.co/pacozaa/mistral-sharegpt90k-merged_16bit) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_pacozaa__mistral-sharegpt90k-merged_16bit",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-04-11T14:56:08.552887](https://huggingface.co/datasets/open-llm-leaderboard/details_pacozaa__mistral-sharegpt90k-merged_16bit/blob/main/results_2024-04-11T14-56-08.552887.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6263229489745822,
"acc_stderr": 0.03253179213124674,
"acc_norm": 0.6323613683810546,
"acc_norm_stderr": 0.03319481559039239,
"mc1": 0.3047735618115055,
"mc1_stderr": 0.016114124156882452,
"mc2": 0.45256376156190653,
"mc2_stderr": 0.014303520809083632
},
"harness|arc:challenge|25": {
"acc": 0.5571672354948806,
"acc_stderr": 0.014515573873348899,
"acc_norm": 0.5930034129692833,
"acc_norm_stderr": 0.014356399418009123
},
"harness|hellaswag|10": {
"acc": 0.6261700856403107,
"acc_stderr": 0.004828305041904404,
"acc_norm": 0.8287193786098387,
"acc_norm_stderr": 0.003759840127150704
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.042446332383532265,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.042446332383532265
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.039105257528497236,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.039105257528497236
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145634,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145634
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944437,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944437
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.02425107126220884,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.02425107126220884
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479047,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479047
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608463,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608463
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.03120469122515002,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.03120469122515002
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8036697247706422,
"acc_stderr": 0.01703071933915434,
"acc_norm": 0.8036697247706422,
"acc_norm_stderr": 0.01703071933915434
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639318,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676166,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676166
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455005,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455005
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596914,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596914
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.037601780060266196,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.037601780060266196
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073332,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073332
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8058748403575989,
"acc_stderr": 0.014143970276657569,
"acc_norm": 0.8058748403575989,
"acc_norm_stderr": 0.014143970276657569
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.024752411960917202,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.024752411960917202
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.288268156424581,
"acc_stderr": 0.015149132860209438,
"acc_norm": 0.288268156424581,
"acc_norm_stderr": 0.015149132860209438
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.025261691219729484,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.025261691219729484
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.026082700695399665,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.026082700695399665
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7160493827160493,
"acc_stderr": 0.025089478523765137,
"acc_norm": 0.7160493827160493,
"acc_norm_stderr": 0.025089478523765137
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4335071707953064,
"acc_stderr": 0.012656810383983967,
"acc_norm": 0.4335071707953064,
"acc_norm_stderr": 0.012656810383983967
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.019139943748487036,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.019139943748487036
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.02904308868330433,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.02904308868330433
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233257,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233257
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774711,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774711
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3047735618115055,
"mc1_stderr": 0.016114124156882452,
"mc2": 0.45256376156190653,
"mc2_stderr": 0.014303520809083632
},
"harness|winogrande|5": {
"acc": 0.7837411207576953,
"acc_stderr": 0.01157061486140934
},
"harness|gsm8k|5": {
"acc": 0.3510235026535254,
"acc_stderr": 0.01314694594139722
}
}
```
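The per-task `acc` values above are rolled up into the `"all"` block. As a minimal sketch of that kind of aggregation (using a small illustrative subset of the tasks, not the leaderboard's exact task set or weighting):

```python
# Sketch: averaging per-task accuracies into an "all"-style aggregate.
# The task names and values below are copied from a few entries of the
# results JSON above; the real aggregate covers every evaluated task.
results = {
    "harness|hendrycksTest-virology|5": 0.5180722891566265,
    "harness|hendrycksTest-world_religions|5": 0.8070175438596491,
    "harness|winogrande|5": 0.7837411207576953,
}

# Unweighted mean over the selected tasks.
mean_acc = sum(results.values()) / len(results)
print(f"mean acc over {len(results)} tasks: {mean_acc:.4f}")
```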
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
AnhTong/wikipedia | ---
dataset_info:
- config_name: en
features:
- name: id
dtype: string
- name: revid
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 18018761534
num_examples: 14208911
download_size: 10329862748
dataset_size: 18018761534
- config_name: vi
features:
- name: id
dtype: string
- name: revid
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1480386957
num_examples: 1569734
download_size: 674677545
dataset_size: 1480386957
configs:
- config_name: en
data_files:
- split: train
path: en/train-*
- config_name: vi
data_files:
- split: train
path: vi/train-*
---
|
KGraph/FB15k-237 | ---
annotations_creators:
- found
- crowdsourced
language:
- en
language_creators: []
license:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: FB15k-237
size_categories:
- 100K<n<1M
source_datasets:
- original
tags:
- knowledge graph
- knowledge
- link prediction
- link
task_categories:
- other
task_ids: []
---
# Dataset Card for FB15k-237
## Table of Contents
- [Dataset Card for FB15k-237](#dataset-card-for-fb15k-237)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://deepai.org/dataset/fb15k-237](https://deepai.org/dataset/fb15k-237)
- **Repository:**
- **Paper:** [More Information Needed](https://paperswithcode.com/dataset/fb15k-237)
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
FB15k-237 is a link prediction dataset created from FB15k. While FB15k consists of 1,345 relations, 14,951 entities, and 592,213 triples, many triples are inverses that cause leakage from the training split to the testing and validation splits. FB15k-237 was created by Toutanova and Chen (2015) to ensure that the testing and evaluation datasets do not have inverse-relation test leakage. In summary, the FB15k-237 dataset contains 310,079 triples with 14,505 entities and 237 relation types.
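FB15k-237 splits are conventionally distributed as plain-text files with one tab-separated `(head, relation, tail)` triple per line, using Freebase MIDs as entity identifiers. A minimal parsing sketch (the sample lines below are illustrative, not taken from this repository's files):

```python
# Sketch: parsing FB15k-237-style triples from tab-separated lines.
sample = (
    "/m/027rn\t/location/country/form_of_government\t/m/06cx9\n"
    "/m/017dcd\t/tv/tv_program/regular_cast./tv/regular_tv_appearance/actor\t/m/06v8s0\n"
)

# One (head, relation, tail) tuple per non-empty line.
triples = [tuple(line.split("\t")) for line in sample.splitlines() if line]

# Entities appear as both heads and tails; relations are the middle column.
entities = {h for h, _, _ in triples} | {t for _, _, t in triples}
relations = {r for _, r, _ in triples}
print(len(triples), len(entities), len(relations))
```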
### Supported Tasks and Leaderboards
Supported Tasks: link prediction task on knowledge graphs.
Leaderboards:
[More Information Needed](https://paperswithcode.com/sota/link-prediction-on-fb15k-237)
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```
@inproceedings{schlichtkrull2018modeling,
title={Modeling relational data with graph convolutional networks},
author={Schlichtkrull, Michael and Kipf, Thomas N and Bloem, Peter and Berg, Rianne van den and Titov, Ivan and Welling, Max},
booktitle={European semantic web conference},
pages={593--607},
year={2018},
organization={Springer}
}
```
### Contributions
Thanks to [@pp413](https://github.com/pp413) for adding this dataset. |
TryMore/n_grams_probability | ---
license: openrail
---
|
multi-train/sentence-compression_1107 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: query
dtype: string
- name: pos
sequence: string
- name: neg
sequence: string
- name: task
dtype: string
- name: instruction
struct:
- name: query
dtype: string
- name: pos
dtype: string
- name: neg
dtype: string
splits:
- name: train
num_bytes: 71393703
num_examples: 180000
download_size: 36617830
dataset_size: 71393703
---
# Dataset Card for "sentence-compression_1107"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zhan1993/flat_x_p_n | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: eval_task
dtype: string
- name: sources_texts
dtype: string
- name: positive_expert_names
dtype: string
- name: negative_expert_names
dtype: string
splits:
- name: train
num_bytes: 377510770
num_examples: 240050
download_size: 5902084
dataset_size: 377510770
---
# Dataset Card for "flat_x_p_n"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/alma_armas_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of alma_armas/アルマ・アルマス/阿尔玛·阿玛斯 (Girls' Frontline)
This is the dataset of alma_armas/アルマ・アルマス/阿尔玛·阿玛斯 (Girls' Frontline), containing 103 images and their tags.
The core tags of this character are `long_hair, blonde_hair, breasts, dark-skinned_female, ahoge, dark_skin, glasses, large_breasts, semi-rimless_eyewear, brown_eyes, over-rim_eyewear, black-framed_eyewear`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 103 | 132.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/alma_armas_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 103 | 72.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/alma_armas_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 241 | 152.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/alma_armas_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 103 | 117.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/alma_armas_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 241 | 222.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/alma_armas_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/alma_armas_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be discoverable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, blush, ribbed_sweater, solo, sweater_lift, turtleneck_sweater, lifted_by_self, looking_at_viewer, mechanical_hands, navel, purple_sweater, nipples, no_bra, very_long_hair, bangs, panties, sitting, smile, white_background |
| 1 | 7 |  |  |  |  |  | 1girl, solo, black_pants, looking_at_viewer, purple_sweater, ribbed_sweater, turtleneck_sweater, black_gloves, long_sleeves, mechanical_hands, smile, bangs, jeans, standing, beer_can, black_footwear, boots, holding_can |
| 2 | 6 |  |  |  |  |  | 1girl, ribbed_sweater, solo, looking_at_viewer, smile, turtleneck_sweater, mechanical_hands, upper_body, black_sweater, lips |
| 3 | 11 |  |  |  |  |  | 1boy, hetero, solo_focus, 1girl, penis, nipples, sweater_lift, blush, paizuri, ribbed_sweater, uncensored, clothed_female_nude_male, no_bra, cum, turtleneck_sweater, yellow_eyes, black_gloves, breasts_squeezed_together, looking_at_viewer, mechanical_hands, pov, smile, breasts_out, fellatio, huge_breasts, interracial, purple_sweater |
| 4 | 13 |  |  |  |  |  | 1girl, solo, looking_at_viewer, navel, smile, white_background, mechanical_hands, thighs, completely_nude, huge_ahoge, simple_background, barefoot, dark_nipples, full_body, laptop, armpits, cleavage, curvy, female_pubic_hair |
| 5 | 6 |  |  |  |  |  | 1girl, solo, black_bra, black_panties, looking_at_viewer, mechanical_hands, navel, cleavage, thighs, yellow_eyes, biting, black_gloves, lace-trimmed_bra, on_bed, sitting, smile, underwear_only |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | ribbed_sweater | solo | sweater_lift | turtleneck_sweater | lifted_by_self | looking_at_viewer | mechanical_hands | navel | purple_sweater | nipples | no_bra | very_long_hair | bangs | panties | sitting | smile | white_background | black_pants | black_gloves | long_sleeves | jeans | standing | beer_can | black_footwear | boots | holding_can | upper_body | black_sweater | lips | 1boy | hetero | solo_focus | penis | paizuri | uncensored | clothed_female_nude_male | cum | yellow_eyes | breasts_squeezed_together | pov | breasts_out | fellatio | huge_breasts | interracial | thighs | completely_nude | huge_ahoge | simple_background | barefoot | dark_nipples | full_body | laptop | armpits | cleavage | curvy | female_pubic_hair | black_bra | black_panties | biting | lace-trimmed_bra | on_bed | underwear_only |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-----------------|:-------|:---------------|:---------------------|:-----------------|:--------------------|:-------------------|:--------|:-----------------|:----------|:---------|:-----------------|:--------|:----------|:----------|:--------|:-------------------|:--------------|:---------------|:---------------|:--------|:-----------|:-----------|:-----------------|:--------|:--------------|:-------------|:----------------|:-------|:-------|:---------|:-------------|:--------|:----------|:-------------|:---------------------------|:------|:--------------|:----------------------------|:------|:--------------|:-----------|:---------------|:--------------|:---------|:------------------|:-------------|:--------------------|:-----------|:---------------|:------------|:---------|:----------|:-----------|:--------|:--------------------|:------------|:----------------|:---------|:-------------------|:---------|:-----------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | | X | X | | X | | X | X | | X | | | | X | | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | | X | X | | X | | X | X | | | | | | | | | X | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 11 |  |  |  |  |  | X | X | X | | X | X | | X | X | | X | X | X | | | | | X | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 4 | 13 |  |  |  |  |  | X | | | X | | | | X | X | X | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 5 | 6 |  |  |  |  |  | X | | | X | | | | X | X | X | | | | | | | X | X | | | X | | | | | | | | | | | | | | | | | | | X | | | | | | | X | | | | | | | | | X | | | X | X | X | X | X | X |
|
wwydmanski/colorectal-carcinoma-microbiome-fengq | ---
task_categories:
- tabular-classification
tags:
- microbiome
- tabular
- gut-microbiota
pretty_name: Colorectal Carcinoma Feng Q 2015
size_categories:
- n<1K
---
## Publication Abstract
Colorectal cancer, a commonly diagnosed cancer in the elderly, often develops slowly from benign polyps called adenoma. The gut microbiota is believed to be directly involved in colorectal carcinogenesis. The identity and functional capacity of the adenoma- or carcinoma-related gut microbe(s), however, have not been surveyed in a comprehensive manner. Here we perform a metagenome-wide association study (MGWAS) on stools from advanced adenoma and carcinoma patients and from healthy subjects, revealing microbial genes, strains and functions enriched in each group. An analysis of potential risk factors indicates that high intake of red meat relative to fruits and vegetables appears to associate with outgrowth of bacteria that might contribute to a more hostile gut environment. These findings suggest that faecal microbiome-based strategies may be useful for early diagnosis and treatment of colorectal adenoma or carcinoma.
## Dataset
156 metagenomic shotgun-sequenced faecal samples from colorectal adenoma and carcinoma patients and healthy controls.
### Configurations
- `presence-absence`
- `CLR`
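In the `CLR` configuration, feature values are centred log-ratios of the taxa abundances. As a rough illustration of what that transform does (a sketch only — the pseudocount handling below is an assumption, and the dataset's exact preprocessing may differ):

```python
import numpy as np

def clr_transform(counts, pseudocount=1.0):
    """Centered log-ratio transform for compositional (abundance) data.

    A pseudocount is added so that zero counts do not blow up the log
    (an assumption here; the dataset's actual zero handling is unspecified).
    """
    x = np.asarray(counts, dtype=float) + pseudocount
    log_x = np.log(x)
    # Subtract each sample's geometric mean in log space (mean of logs)
    return log_x - log_x.mean(axis=1, keepdims=True)

X = np.array([[10, 0, 90],
              [5, 5, 90]])
X_clr = clr_transform(X)
# Each row of a CLR-transformed matrix sums to zero by construction
```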
## Usage
```python
import numpy as np
from datasets import load_dataset

dataset = load_dataset("wwydmanski/colorectal-carcinoma-microbiome-fengq", "presence-absence")
train_dataset, test_dataset = dataset['train'], dataset['test']

X_train = np.array(train_dataset['values'])
y_train = np.array(train_dataset['target'])
X_test = np.array(test_dataset['values'])
y_test = np.array(test_dataset['target'])
``` |
ibragim-bad/mmlu_he | ---
dataset_info:
features:
- name: ind
dtype: int64
- name: question
dtype: string
- name: choices
sequence: string
- name: subject
dtype: string
- name: answer
dtype: int64
- name: split
dtype: string
splits:
- name: test
num_bytes: 5953991
num_examples: 11380
- name: validation
num_bytes: 658530
num_examples: 1235
- name: dev
num_bytes: 100830
num_examples: 229
download_size: 3056768
dataset_size: 6713351
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: validation
path: data/validation-*
- split: dev
path: data/dev-*
---
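The features above follow the standard MMLU layout (a `question`, four `choices`, and an integer `answer` index). A minimal sketch of turning one record into a multiple-choice prompt — the sample record below is illustrative, not drawn from the dataset:

```python
def format_mmlu_prompt(example):
    """Render one MMLU-style record as an A/B/C/D multiple-choice prompt."""
    letters = "ABCD"
    lines = [example["question"]]
    lines += [f"{l}. {c}" for l, c in zip(letters, example["choices"])]
    lines.append("Answer:")
    return "\n".join(lines)

# Illustrative record matching the schema (not taken from the dataset)
sample = {
    "question": "Which planet is known as the Red Planet?",
    "choices": ["Venus", "Mars", "Jupiter", "Saturn"],
    "answer": 1,  # index into `choices`
}
prompt = format_mmlu_prompt(sample)
```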
|
breno30/SimonDias | ---
license: openrail
---
|
heliosprime/twitter_dataset_1713158136 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 5656
num_examples: 15
download_size: 10185
dataset_size: 5656
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713158136"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
aomocelin/commonvoice_13_0_pt_48kHz_simplificado_augmented_white_noise | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: client_id
dtype: string
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 48000
- name: sentence
dtype: string
- name: up_votes
dtype: int64
- name: down_votes
dtype: int64
- name: age
dtype: string
- name: gender
dtype: string
- name: accent
dtype: string
- name: locale
dtype: string
- name: segment
dtype: string
- name: variant
dtype: string
splits:
- name: train
num_bytes: 11720605773.5
num_examples: 29020
- name: test
num_bytes: 281122914.928
num_examples: 9072
download_size: 11993487504
dataset_size: 12001728688.428
---
# Dataset Card for "commonvoice_13_0_pt_48kHz_simplificado_augmented_white_noise"
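This split was augmented with white noise. A minimal sketch of that kind of augmentation on a raw waveform array — the SNR-based scaling here is an assumption, since the actual augmentation parameters are not documented on this card:

```python
import numpy as np

def add_white_noise(audio, snr_db, rng=None):
    """Mix Gaussian white noise into a waveform at the given signal-to-noise ratio (dB)."""
    rng = np.random.default_rng() if rng is None else rng
    signal_power = np.mean(audio ** 2)
    noise_power = signal_power / (10 ** (snr_db / 10))
    noise = rng.normal(0.0, np.sqrt(noise_power), size=audio.shape)
    return audio + noise

# Example: 1 second of a 440 Hz tone at 48 kHz, noised at 20 dB SNR
t = np.linspace(0, 1, 48000, endpoint=False)
clean = 0.5 * np.sin(2 * np.pi * 440 * t)
noisy = add_white_noise(clean, snr_db=20, rng=np.random.default_rng(0))
```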
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yangyz1230/nt-aligned | ---
dataset_info:
features:
- name: name
dtype: string
- name: sequence
dtype: string
- name: chrom
dtype: string
- name: start
dtype: int64
- name: end
dtype: int64
- name: strand
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 19376225
num_examples: 53003
- name: test
num_bytes: 2150643
num_examples: 5883
download_size: 10294835
dataset_size: 21526868
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
molchomen/colors | ---
license: mit
task_categories:
- text-classification
--- |
htdung167/fleurs-vi-preprocessed-v2 | ---
dataset_info:
features:
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: original_sentence
dtype: string
- name: preprocessed_sentence
dtype: string
- name: preprocessed_sentence_v2
dtype: string
splits:
- name: train
num_bytes: 2093435217.394
num_examples: 2994
- name: validation
num_bytes: 275436921.0
num_examples: 361
- name: test
num_bytes: 692730041.0
num_examples: 857
download_size: 3041038548
dataset_size: 3061602179.394
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
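The card ships an `original_sentence` column alongside two preprocessed transcript columns. A minimal sketch of the kind of text normalization such preprocessing typically applies (illustrative only — the dataset's actual pipeline is not documented here):

```python
import re
import unicodedata

def normalize_transcript(text):
    """Lowercase, strip punctuation, and collapse whitespace in a transcript.

    NFC normalization keeps Vietnamese diacritics composed consistently.
    """
    text = unicodedata.normalize("NFC", text)
    text = text.lower()
    text = re.sub(r"[^\w\s]", "", text)      # drop punctuation, keep letters/digits
    return re.sub(r"\s+", " ", text).strip() # collapse runs of whitespace

cleaned = normalize_transcript("Xin chào, Việt Nam!")
```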
|
Falah/sci_fi_fighter_plane_prompts_SDXL | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 581050176
num_examples: 1000000
download_size: 76478423
dataset_size: 581050176
---
# Dataset Card for "sci_fi_fighter_plane_prompts_SDXL"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Sefika/semeval_prompts | ---
license: mit
---
|
krame-aims/assign3_fr_en_AIMS | ---
dataset_info:
features:
- name: fr
dtype: string
- name: en
dtype: string
splits:
- name: train
num_bytes: 1542636
num_examples: 4000
download_size: 919826
dataset_size: 1542636
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AppleHarem/lava_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of lava (Arknights)
This is the dataset of lava (Arknights), containing 63 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
This is a WebUI that contains crawlers and other things: ([LittleAppleWebUI](https://github.com/LittleApple-fp16/LittleAppleWebUI))
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 63 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 156 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 176 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 63 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 63 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 63 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 156 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 156 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 100 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 176 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 176 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
XiangPan/ChnSentiCorp_htl_8k | ---
license: other
---
|
jamestalentium/xsum_10_test | ---
dataset_info:
features:
- name: input_text
dtype: string
- name: output_text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 15613650.659431798
num_examples: 6614
download_size: 0
dataset_size: 15613650.659431798
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for "xsum_10_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
aparnakesarkar/autotrain-data-auto-nlp-poc | ---
language:
- en
---
# AutoTrain Dataset for project: auto-nlp-poc
## Dataset Description
This dataset has been automatically processed by AutoTrain for project auto-nlp-poc.
### Languages
The BCP-47 code for the dataset's language is en.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"context": "Simultaneously with these conflicts, bison, a keystone species and the primary protein source that Native people had survived on for centuries were being destroyed. Some estimates say there were over 13 million bison in Montana in 1870. In 1875, General Philip Sheridan pleaded to a joint session of Congress to authorize the slaughtering of herds in order to deprive the Indians of their source of food. By 1884, commercial hunting had brought bison to the verge of extinction; only about 325 bison remained in the entire United States.",
"question": "In 1884 about how many bison remained?",
"answers.text": [
"about 325"
],
"answers.answer_start": [
484
],
"feat_id": [
"573425624776f4190066195a"
],
"feat_title": [
"Montana"
]
},
{
"context": "The two became friends, and for many years lived in close proximity in Paris, Chopin at 38 Rue de la Chauss\u00e9e-d'Antin, and Liszt at the H\u00f4tel de France on the Rue Lafitte, a few blocks away. They performed together on seven occasions between 1833 and 1841. The first, on 2 April 1833, was at a benefit concert organized by Hector Berlioz for his bankrupt Shakespearean actress wife Harriet Smithson, during which they played George Onslow's Sonata in F minor for piano duet. Later joint appearances included a benefit concert for the Benevolent Association of Polish Ladies in Paris. Their last appearance together in public was for a charity concert conducted for the Beethoven Memorial in Bonn, held at the Salle Pleyel and the Paris Conservatory on 25 and 26 April 1841.",
"question": "What was the charity that Liszt and Chopin last performed for?",
"answers.text": [
"the Beethoven Memorial in Bonn"
],
"answers.answer_start": [
665
],
"feat_id": [
"56cf7c234df3c31400b0d834"
],
"feat_title": [
"Fr\u00e9d\u00e9ric_Chopin"
]
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"context": "Value(dtype='string', id=None)",
"question": "Value(dtype='string', id=None)",
"answers.text": "Sequence(feature=Value(dtype='string', id=None), length=-1, id=None)",
"answers.answer_start": "Sequence(feature=Value(dtype='int32', id=None), length=-1, id=None)",
"feat_id": "Sequence(feature=Value(dtype='string', id=None), length=-1, id=None)",
"feat_title": "Sequence(feature=Value(dtype='string', id=None), length=-1, id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follow:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 69946 |
| valid | 17487 |
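The `answers.answer_start` field gives the character offset of each answer inside `context`, so answer spans can be recovered by slicing. A minimal sketch — the record below is a made-up example following the schema, not a row from the dataset:

```python
def extract_answers(example):
    """Recover each answer span from the context using answers.answer_start."""
    ctx = example["context"]
    return [ctx[s:s + len(t)]
            for t, s in zip(example["answers.text"], example["answers.answer_start"])]

# Illustrative record following the schema above (not copied from the dataset)
record = {
    "context": "The Battle of Hastings took place in 1066.",
    "question": "When did the Battle of Hastings take place?",
    "answers.text": ["1066"],
    "answers.answer_start": [37],
}
spans = extract_answers(record)
```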
|
mphonkala/absolom_voice | ---
language:
- en
tags:
- code
---
This is the dataset we will use to train our artificial intelligence model.
During our data-gathering and planning process we realized that we needed the dataset hosted
somewhere we could train from in Google Colab, and Hugging Face is the most convenient place for that.
Using Google Colab, we will fetch the Whisper model and our dataset from Hugging Face and then fine-tune the model,
so that we end up with a real-time transcription model that recognizes and translates the lecturer's voice
with near-100% accuracy. |
enriquevillalbarod/piedras | ---
license: apache-2.0
---
|