datasetId stringlengths 2 117 | card stringlengths 19 1.01M |
|---|---|
Multimodal-Fatima/VQAv2_test_split_4 | ---
dataset_info:
features:
- name: question_type
dtype: string
- name: multiple_choice_answer
dtype: string
- name: answers
sequence: string
- name: answers_original
list:
- name: answer
dtype: string
- name: answer_confidence
dtype: string
- name: answer_id
dtype: int64
- name: id_image
dtype: int64
- name: answer_type
dtype: string
- name: question_id
dtype: int64
- name: question
dtype: string
- name: image
dtype: image
- name: id
dtype: int64
- name: clip_tags_ViT_L_14
sequence: string
- name: blip_caption
dtype: string
- name: LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14
sequence: string
- name: DETA_detections_deta_swin_large_o365_coco_classes
list:
- name: attribute
dtype: string
- name: box
sequence: float32
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float32
- name: size
dtype: string
- name: tag
dtype: string
- name: Attributes_ViT_L_14_descriptors_text_davinci_003_full
sequence: string
- name: clip_tags_ViT_L_14_wo_openai
sequence: string
- name: clip_tags_ViT_L_14_with_openai
sequence: string
- name: clip_tags_LAION_ViT_H_14_2B_wo_openai
sequence: string
- name: clip_tags_LAION_ViT_H_14_2B_with_openai
sequence: string
- name: clip_tags_LAION_ViT_bigG_14_2B_wo_openai
sequence: string
- name: clip_tags_LAION_ViT_bigG_14_2B_with_openai
sequence: string
- name: Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full
sequence: string
- name: Attributes_LAION_ViT_bigG_14_2B_descriptors_text_davinci_003_full
sequence: string
- name: clip_tags_ViT_B_16_with_openai
sequence: string
- name: DETA_detections_deta_swin_large_o365_coco_classes_caption_module_random
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: captions_module
sequence: string
- name: captions_module_filter
sequence: string
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
splits:
- name: test
num_bytes: 9135639447.0
num_examples: 44779
download_size: 1867482751
dataset_size: 9135639447.0
---
# Dataset Card for "VQAv2_test_split_4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EleutherAI/hendrycks_math | ---
license: mit
--- |
aditnnda/gacoanGenteng | ---
license: apache-2.0
---
|
ShrinivasSK/en_hi_1 | ---
dataset_info:
features:
- name: idx
dtype: int64
- name: src
dtype: string
- name: tgt
dtype: string
splits:
- name: train
num_bytes: 6435127.8
num_examples: 18000
- name: test
num_bytes: 715014.2
num_examples: 2000
download_size: 3824291
dataset_size: 7150142.0
---
# Dataset Card for "data_hi_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Tippawan/test2-data-semi-trainulb-r3-WLV | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: ner_tags
sequence: int64
- name: prob
sequence: float64
- name: ifpass
sequence: int64
- name: pred
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 71056510
num_examples: 37243
download_size: 12950153
dataset_size: 71056510
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
andersonbcdefg/filtered_triples_with_margins | ---
dataset_info:
features:
- name: query
dtype: string
- name: pos
dtype: string
- name: neg
dtype: string
- name: source
dtype: string
- name: qp_sim
dtype: float32
- name: qn_sim
dtype: float32
- name: pn_sim
dtype: float32
- name: margin
dtype: float64
splits:
- name: train
num_bytes: 1134373304
num_examples: 1246705
download_size: 692171169
dataset_size: 1134373304
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Tap-M/Llama2-Uncensored-Orca-Lima-Style | ---
license: agpl-3.0
---
|
hatakeyama-llm-team/WikiBookJa | ---
license: cc-by-3.0
--- |
Sharathhebbar24/BeaverTails_filtered | ---
language:
- en
license: apache-2.0
size_categories:
- 10K<n<100K
task_categories:
- text-generation
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 78049982
num_examples: 161784
download_size: 40376094
dataset_size: 78049982
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Beaver Tails
This is a cleansed version of [PKU-Alignment/BeaverTails](https://huggingface.co/datasets/PKU-Alignment/BeaverTails).
It is available in two versions, split by content sensitivity:
## Filtered
### Usage
```python
from datasets import load_dataset
dataset = load_dataset("Sharathhebbar24/BeaverTails_filtered", split="train")
```
## Unfiltered
### Usage
```python
from datasets import load_dataset
dataset = load_dataset("Sharathhebbar24/BeaverTails_unfiltered", split="train")
```
|
mrbrain404/my_datasets | ---
license: other
---
|
elvis-d/train | ---
license: apache-2.0
---
|
mask-distilled-one-sec-cv12/chunk_165 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1171949260
num_examples: 230155
download_size: 1197293781
dataset_size: 1171949260
---
# Dataset Card for "chunk_165"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BelalElhossany/mgb2_audios_transcriptions | ---
dataset_info:
features:
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 1181823173.4
num_examples: 5842
download_size: 1141789958
dataset_size: 1181823173.4
---
# Dataset Card for "mgb2_audios_transcriptions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HuggingFaceM4/VQAv2_modif-Dummy |  |
Duskfallcrew/Alex_Brightman | ---
license: creativeml-openrail-m
task_categories:
- text-to-image
language:
- en
tags:
- lora
- lycoris
pretty_name: Alex Brightman
size_categories:
- 1K<n<10K
---
Alex Brightman Lycoris |
mnoukhov/openai_summarize_tldr_rbaseline | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: prompt
dtype: string
- name: label
dtype: string
- name: reward_baseline
dtype: float32
splits:
- name: train
num_bytes: 181611007
num_examples: 116722
- name: valid
num_bytes: 3120685
num_examples: 2000
download_size: 113145941
dataset_size: 184731692
---
# Dataset Card for "openai_summarize_tldr_rbaseline"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
deepapaikar/Sentence_pairs | ---
license: apache-2.0
dataset_info:
features:
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 1901011
num_examples: 12560
download_size: 991351
dataset_size: 1901011
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Ediudo/sal | ---
license: openrail
---
|
DBQ/Fendi.Standard.Categories.Italy | ---
annotations_creators:
- other
language_creators:
- other
language:
- en
license:
- unknown
multilinguality:
- monolingual
source_datasets:
- original
task_categories:
- text-classification
- image-classification
- feature-extraction
- image-segmentation
- image-to-image
- image-to-text
- object-detection
- summarization
- zero-shot-image-classification
pretty_name: Italy - Fendi - Fashion Standard Categories
tags:
- webscraping
- ecommerce
- Fendi
- fashion
- fashion product
- image
- fashion image
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: dbq_prd_type
dtype: string
- name: competence_date
dtype: string
- name: website_name
dtype: string
- name: product_code
dtype: string
- name: tag_field
dtype: string
splits:
- name: train
num_bytes: 271821
num_examples: 3055
download_size: 56203
dataset_size: 271821
---
# Fendi web scraped data
## About the website
In the **EMEA** region, particularly in **Italy**, the **luxury fashion industry** has immense influence and contributes significantly to Italy's economy. Brands such as **Fendi** are prominent players in this sector. Emphasizing haute couture, ready-to-wear clothing, leather goods, shoes, fragrances, eyewear, timepieces and accessories, the industry has seen significant growth with the adoption of **Ecommerce**. Our dataset provides an insightful look at Ecommerce product tag data on Fendi products in Italy. The data aids in understanding the online buying behavior of luxury fashion consumers in Italy, offering strong potential for market analysis and strategy development.
## Link to **dataset**
[Italy - Fendi - Fashion Standard Categories dataset](https://www.databoutique.com/buy-data-page/Fendi%20Standard%20Categories%20Italy/r/recrd9vOREnRQ68V1)
|
rgricardo/Takubgroup | ---
license: openrail
---
|
Susmita1302/images | ---
license: mit
---
|
vadimpetrov/common_voice_13_0_ru_pseudo_labelled | ---
dataset_info:
config_name: ru
features:
- name: client_id
dtype: string
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
- name: up_votes
dtype: int64
- name: down_votes
dtype: int64
- name: age
dtype: string
- name: gender
dtype: string
- name: accent
dtype: string
- name: locale
dtype: string
- name: segment
dtype: string
- name: variant
dtype: string
- name: whisper_transcript
sequence: int64
splits:
- name: train
num_bytes: 991934032.405
num_examples: 26269
- name: validation
num_bytes: 401622741.215
num_examples: 10185
- name: test
num_bytes: 391004723.912
num_examples: 10186
download_size: 1724461730
dataset_size: 1784561497.5319998
configs:
- config_name: ru
data_files:
- split: train
path: ru/train-*
- split: validation
path: ru/validation-*
- split: test
path: ru/test-*
---
|
cvzion/dataset-dqg1 | ---
dataset_info:
features:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 57020
num_examples: 238
download_size: 16985
dataset_size: 57020
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/idw_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of idw/IDW/IDW (Girls' Frontline)
This is the dataset of idw/IDW/IDW (Girls' Frontline), containing 125 images and their tags.
The core tags of this character are `animal_ears, blonde_hair, cat_ears, blue_eyes, long_hair, twintails, hair_ornament, bangs, hair_between_eyes, tail, fang, hairclip, cat_tail, breasts, animal_ear_fluff`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 125 | 142.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/idw_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 125 | 82.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/idw_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 295 | 177.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/idw_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 125 | 126.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/idw_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 295 | 249.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/idw_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/idw_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, solo, fingerless_gloves, shorts, boots, headset, looking_at_viewer, necktie, open_mouth, suspenders, holding_gun, knee_pads, smile, full_body, panties, submachine_gun, :3, sleeves_rolled_up, white_shirt |
| 1 | 7 |  |  |  |  |  | 1girl, fingerless_gloves, headset, open_mouth, suspender_shorts, collared_shirt, white_shirt, black_shorts, holding_gun, sleeves_rolled_up, :d, blue_panties, character_name, cowboy_shot, short_shorts, submachine_gun, 2girls, looking_at_viewer, solo_focus, striped_necktie, v-shaped_eyebrows |
| 2 | 11 |  |  |  |  |  | 1girl, elbow_gloves, solo, black_dress, black_gloves, official_alternate_costume, open_mouth, smile, bare_shoulders, choker, small_breasts, collarbone, looking_at_viewer, strapless_dress, sunglasses, simple_background, thigh_strap, :3, gun, tail_ribbon, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | fingerless_gloves | shorts | boots | headset | looking_at_viewer | necktie | open_mouth | suspenders | holding_gun | knee_pads | smile | full_body | panties | submachine_gun | :3 | sleeves_rolled_up | white_shirt | suspender_shorts | collared_shirt | black_shorts | :d | blue_panties | character_name | cowboy_shot | short_shorts | 2girls | solo_focus | striped_necktie | v-shaped_eyebrows | elbow_gloves | black_dress | black_gloves | official_alternate_costume | bare_shoulders | choker | small_breasts | collarbone | strapless_dress | sunglasses | simple_background | thigh_strap | gun | tail_ribbon | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:---------|:--------|:----------|:--------------------|:----------|:-------------|:-------------|:--------------|:------------|:--------|:------------|:----------|:-----------------|:-----|:--------------------|:--------------|:-------------------|:-----------------|:---------------|:-----|:---------------|:-----------------|:--------------|:---------------|:---------|:-------------|:------------------|:--------------------|:---------------|:--------------|:---------------|:-----------------------------|:-----------------|:---------|:----------------|:-------------|:------------------|:-------------|:--------------------|:--------------|:------|:--------------|:-------------------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | | X | | | X | X | | X | | X | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 2 | 11 |  |  |  |  |  | X | X | | | | | X | | X | | | | X | | | | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
timm/imagenet-12k-wds | ---
license: other
license_name: imagenet
license_link: https://www.image-net.org/download.php
task_categories:
- image-classification
pretty_name: ImageNet-12k
size_categories:
- 10M<n<100M
extra_gated_prompt: >-
By clicking on “Access repository” below, you also agree to ImageNet Terms of
Access:
[RESEARCHER_FULLNAME] (the "Researcher") has requested permission to use the
ImageNet database (the "Database") at Princeton University and Stanford
University. In exchange for such permission, Researcher hereby agrees to the
following terms and conditions:
1. Researcher shall use the Database only for non-commercial research and
educational purposes.
2. Princeton University, Stanford University and Hugging Face make no
representations or warranties regarding the Database, including but not
limited to warranties of non-infringement or fitness for a particular purpose.
3. Researcher accepts full responsibility for his or her use of the Database
and shall defend and indemnify the ImageNet team, Princeton University,
Stanford University and Hugging Face, including their employees, Trustees,
officers and agents, against any and all claims arising from Researcher's use
of the Database, including but not limited to Researcher's use of any copies
of copyrighted images that he or she may create from the Database.
4. Researcher may provide research associates and colleagues with access to
the Database provided that they first agree to be bound by these terms and
conditions.
5. Princeton University, Stanford University and Hugging Face reserve the
right to terminate Researcher's access to the Database at any time.
6. If Researcher is employed by a for-profit, commercial entity, Researcher's
employer shall also be bound by these terms and conditions, and Researcher
hereby represents that he or she is fully authorized to enter into this
agreement on behalf of such employer.
7. The law of the State of New Jersey shall apply to all disputes under this
agreement.
tags:
- webdataset
---
## Dataset Description
- **Homepage:** https://image-net.org/index.php
- **Repository:** https://github.com/rwightman/imagenet-12k
- **Paper:** https://arxiv.org/abs/1409.0575
### Dataset Summary
This is a filtered copy of the full [ImageNet](https://www.image-net.org/) dataset consisting of the top 11821 (of 21841) classes by number of samples. It has been used to pretrain a number of `in12k` models in `timm`.
The code and metadata for building this dataset from the original full ImageNet can be found at https://github.com/rwightman/imagenet-12k
NOTE: This subset was filtered from the original `fall11` ImageNet release, which has since been replaced by the `winter21` release; `winter21` removes close to 3000 synsets containing people, a number of which are of an offensive or sensitive nature. Work is in progress to filter a similar dataset from `winter21`. [ImageNet-21k-P](https://github.com/Alibaba-MIIL/ImageNet21K/blob/main/dataset_preprocessing/processing_instructions.md) already exists, but uses different thresholds & preprocessing steps.
### Data Splits
Unlike ImageNet-1k (ILSVRC 2012), the full ImageNet dataset has no defined splits. This subset includes a validation split consisting of 40 samples per class for each of the 11821 classes.
#### Train
* `imagenet12k-train-{0000..1023}.tar`
* 12129687 samples over 1024 shards
#### Validation
* `imagenet12k-validation-{0000..0511}.tar`
* 472840 samples over 512 shards
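The shard patterns above use WebDataset-style brace notation. As a minimal sketch (assuming the standard zero-padded `{0000..1023}` expansion), the concrete filenames can be generated, and the validation count checks out against 40 samples per class:

```python
# Expand the WebDataset-style brace patterns into concrete shard names
# (assumption: standard zero-padded {0000..1023} notation).
train_shards = [f"imagenet12k-train-{i:04d}.tar" for i in range(1024)]
val_shards = [f"imagenet12k-validation-{i:04d}.tar" for i in range(512)]

assert train_shards[0] == "imagenet12k-train-0000.tar"
assert train_shards[-1] == "imagenet12k-train-1023.tar"

# The validation size matches 40 samples for each of the 11821 classes.
assert 40 * 11821 == 472840
```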
### Processing
I performed some processing while sharding this dataset:
* All exif tags not related to color space were removed
* All images with width or height < 48 were removed.
* All images with the smallest edge > 600 were resized, maintaining aspect ratio, so that the smallest edge equals 600. This improves size & decoding-time uniformity for typical pretraining use cases.
* Images were pre-shuffled across the shards
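The resize rule above can be sketched as a small helper. This is an illustrative reconstruction, not the actual sharding code (which lives in the linked repo):

```python
def resize_target(width: int, height: int, max_edge: int = 600) -> tuple:
    """Illustrative sketch of the resize rule above: if the smallest edge
    exceeds max_edge, scale the image down (keeping aspect ratio) so the
    smallest edge equals max_edge; otherwise leave it unchanged."""
    smallest = min(width, height)
    if smallest <= max_edge:
        return (width, height)
    scale = max_edge / smallest
    return (round(width * scale), round(height * scale))

# A 1200x900 image has smallest edge 900 > 600, so it is scaled to 800x600.
assert resize_target(1200, 900) == (800, 600)
# A 500x400 image is already within the limit and is left as-is.
assert resize_target(500, 400) == (500, 400)
```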
## Additional Information
### Dataset Curators
Authors of [[1]](https://arxiv.org/abs/1409.0575) and [[2]](https://ieeexplore.ieee.org/abstract/document/5206848):
- Olga Russakovsky
- Jia Deng
- Hao Su
- Jonathan Krause
- Sanjeev Satheesh
- Wei Dong
- Richard Socher
- Li-Jia Li
- Kai Li
- Sean Ma
- Zhiheng Huang
- Andrej Karpathy
- Aditya Khosla
- Michael Bernstein
- Alexander C Berg
- Li Fei-Fei
### Licensing Information
In exchange for permission to use the ImageNet database (the "Database") at Princeton University and Stanford University, Researcher hereby agrees to the following terms and conditions:
1. Researcher shall use the Database only for non-commercial research and educational purposes.
1. Princeton University and Stanford University make no representations or warranties regarding the Database, including but not limited to warranties of non-infringement or fitness for a particular purpose.
1. Researcher accepts full responsibility for his or her use of the Database and shall defend and indemnify the ImageNet team, Princeton University, and Stanford University, including their employees, Trustees, officers and agents, against any and all claims arising from Researcher's use of the Database, including but not limited to Researcher's use of any copies of copyrighted images that he or she may create from the Database.
1. Researcher may provide research associates and colleagues with access to the Database provided that they first agree to be bound by these terms and conditions.
1. Princeton University and Stanford University reserve the right to terminate Researcher's access to the Database at any time.
1. If Researcher is employed by a for-profit, commercial entity, Researcher's employer shall also be bound by these terms and conditions, and Researcher hereby represents that he or she is fully authorized to enter into this agreement on behalf of such employer.
1. The law of the State of New Jersey shall apply to all disputes under this agreement.
### Citation Information
```bibtex
@article{imagenet15russakovsky,
Author = {Olga Russakovsky and Jia Deng and Hao Su and Jonathan Krause and Sanjeev Satheesh and Sean Ma and Zhiheng Huang and Andrej Karpathy and Aditya Khosla and Michael Bernstein and Alexander C. Berg and Li Fei-Fei},
Title = { {ImageNet Large Scale Visual Recognition Challenge} },
Year = {2015},
journal = {International Journal of Computer Vision (IJCV)},
doi = {10.1007/s11263-015-0816-y},
volume={115},
number={3},
pages={211-252}
}
``` |
RKnijn/ds-llama2 | ---
dataset_info:
features:
- name: text
dtype: string
- name: meta
struct:
- name: source
dtype: string
splits:
- name: train
num_bytes: 51117417
num_examples: 126287
download_size: 26326368
dataset_size: 51117417
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
daze-unlv/medmcqa-mistral-FT-1000 | ---
license: apache-2.0
---
|
Mariofm02/cards_images | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': ace of clubs
'1': ace of diamonds
'2': ace of hearts
'3': ace of spades
'4': eight of clubs
'5': eight of diamonds
'6': eight of hearts
'7': eight of spades
'8': five of clubs
'9': five of diamonds
'10': five of hearts
'11': five of spades
'12': four of clubs
'13': four of diamonds
'14': four of hearts
'15': four of spades
'16': jack of clubs
'17': jack of diamonds
'18': jack of hearts
'19': jack of spades
'20': king of clubs
'21': king of diamonds
'22': king of hearts
'23': king of spades
'24': nine of clubs
'25': nine of diamonds
'26': nine of hearts
'27': nine of spades
'28': queen of clubs
'29': queen of diamonds
'30': queen of hearts
'31': queen of spades
'32': seven of clubs
'33': seven of diamonds
'34': seven of hearts
'35': seven of spades
'36': six of clubs
'37': six of diamonds
'38': six of hearts
'39': six of spades
'40': ten of clubs
'41': ten of diamonds
'42': ten of hearts
'43': ten of spades
'44': three of clubs
'45': three of diamonds
'46': three of hearts
'47': three of spades
'48': two of clubs
'49': two of diamonds
'50': two of hearts
'51': two of spades
splits:
- name: train
num_bytes: 130695849.8739789
num_examples: 6603
- name: test
num_bytes: 23038296.14202111
num_examples: 1166
download_size: 150881036
dataset_size: 153734146.016
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
rlinares2/chatbot_arena_embeddings_adav3 | ---
dataset_info:
features:
- name: question_embedding
sequence: float64
- name: answer_embeddings_a
sequence: float64
- name: answer_embeddings_b
sequence: float64
splits:
- name: train
num_bytes: 885024000
num_examples: 24000
download_size: 647518122
dataset_size: 885024000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_steve-cse__MelloGPT | ---
pretty_name: Evaluation run of steve-cse/MelloGPT
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [steve-cse/MelloGPT](https://huggingface.co/steve-cse/MelloGPT) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_steve-cse__MelloGPT\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-16T18:07:59.704375](https://huggingface.co/datasets/open-llm-leaderboard/details_steve-cse__MelloGPT/blob/main/results_2023-12-16T18-07-59.704375.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5580941639624716,\n\
\ \"acc_stderr\": 0.03400113351875304,\n \"acc_norm\": 0.5629530798562845,\n\
\ \"acc_norm_stderr\": 0.03471812426856797,\n \"mc1\": 0.38555691554467564,\n\
\ \"mc1_stderr\": 0.017038839010591673,\n \"mc2\": 0.5560955080575972,\n\
\ \"mc2_stderr\": 0.014788612119825833\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5247440273037542,\n \"acc_stderr\": 0.014593487694937738,\n\
\ \"acc_norm\": 0.53839590443686,\n \"acc_norm_stderr\": 0.01456824555029636\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5646285600477993,\n\
\ \"acc_stderr\": 0.004947922692688834,\n \"acc_norm\": 0.761202947619996,\n\
\ \"acc_norm_stderr\": 0.004254771367531346\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4074074074074074,\n\
\ \"acc_stderr\": 0.04244633238353229,\n \"acc_norm\": 0.4074074074074074,\n\
\ \"acc_norm_stderr\": 0.04244633238353229\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5460526315789473,\n \"acc_stderr\": 0.04051646342874142,\n\
\ \"acc_norm\": 0.5460526315789473,\n \"acc_norm_stderr\": 0.04051646342874142\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n\
\ \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6113207547169811,\n \"acc_stderr\": 0.030000485448675986,\n\
\ \"acc_norm\": 0.6113207547169811,\n \"acc_norm_stderr\": 0.030000485448675986\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6041666666666666,\n\
\ \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.6041666666666666,\n\
\ \"acc_norm_stderr\": 0.04089465449325582\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5549132947976878,\n\
\ \"acc_stderr\": 0.03789401760283648,\n \"acc_norm\": 0.5549132947976878,\n\
\ \"acc_norm_stderr\": 0.03789401760283648\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077636,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.03267862331014063,\n\
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.03267862331014063\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3508771929824561,\n\
\ \"acc_stderr\": 0.04489539350270699,\n \"acc_norm\": 0.3508771929824561,\n\
\ \"acc_norm_stderr\": 0.04489539350270699\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155254,\n \"\
acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155254\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.043062412591271526,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.043062412591271526\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6548387096774193,\n \"acc_stderr\": 0.02704574657353433,\n \"\
acc_norm\": 0.6548387096774193,\n \"acc_norm_stderr\": 0.02704574657353433\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.43349753694581283,\n \"acc_stderr\": 0.03486731727419872,\n \"\
acc_norm\": 0.43349753694581283,\n \"acc_norm_stderr\": 0.03486731727419872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.03663974994391244,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.03663974994391244\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7272727272727273,\n \"acc_stderr\": 0.03173071239071724,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03173071239071724\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7409326424870466,\n \"acc_stderr\": 0.031618779179354115,\n\
\ \"acc_norm\": 0.7409326424870466,\n \"acc_norm_stderr\": 0.031618779179354115\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5307692307692308,\n \"acc_stderr\": 0.025302958890850154,\n\
\ \"acc_norm\": 0.5307692307692308,\n \"acc_norm_stderr\": 0.025302958890850154\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815635,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815635\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5630252100840336,\n \"acc_stderr\": 0.03221943636566196,\n \
\ \"acc_norm\": 0.5630252100840336,\n \"acc_norm_stderr\": 0.03221943636566196\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7174311926605504,\n \"acc_stderr\": 0.019304243497707152,\n \"\
acc_norm\": 0.7174311926605504,\n \"acc_norm_stderr\": 0.019304243497707152\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854052,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854052\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6764705882352942,\n \"acc_stderr\": 0.032834720561085606,\n \"\
acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.032834720561085606\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.679324894514768,\n \"acc_stderr\": 0.0303819319499904,\n \
\ \"acc_norm\": 0.679324894514768,\n \"acc_norm_stderr\": 0.0303819319499904\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n\
\ \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.6367713004484304,\n\
\ \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.03915345408847834,\n\
\ \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.03915345408847834\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6363636363636364,\n \"acc_stderr\": 0.043913262867240704,\n \"\
acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.043913262867240704\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n\
\ \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489122,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489122\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.04656147110012351,\n\
\ \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.04656147110012351\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077788,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077788\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7471264367816092,\n\
\ \"acc_stderr\": 0.015543377313719681,\n \"acc_norm\": 0.7471264367816092,\n\
\ \"acc_norm_stderr\": 0.015543377313719681\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5982658959537572,\n \"acc_stderr\": 0.026394104177643637,\n\
\ \"acc_norm\": 0.5982658959537572,\n \"acc_norm_stderr\": 0.026394104177643637\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25251396648044694,\n\
\ \"acc_stderr\": 0.014530330201468647,\n \"acc_norm\": 0.25251396648044694,\n\
\ \"acc_norm_stderr\": 0.014530330201468647\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6405228758169934,\n \"acc_stderr\": 0.027475969910660952,\n\
\ \"acc_norm\": 0.6405228758169934,\n \"acc_norm_stderr\": 0.027475969910660952\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6141479099678456,\n\
\ \"acc_stderr\": 0.027648149599751464,\n \"acc_norm\": 0.6141479099678456,\n\
\ \"acc_norm_stderr\": 0.027648149599751464\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.027339546640662734,\n\
\ \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.027339546640662734\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.375886524822695,\n \"acc_stderr\": 0.028893955412115882,\n \
\ \"acc_norm\": 0.375886524822695,\n \"acc_norm_stderr\": 0.028893955412115882\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39765319426336376,\n\
\ \"acc_stderr\": 0.012499840347460642,\n \"acc_norm\": 0.39765319426336376,\n\
\ \"acc_norm_stderr\": 0.012499840347460642\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.030161911930767102,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.030161911930767102\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5277777777777778,\n \"acc_stderr\": 0.020196594933541194,\n \
\ \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.020196594933541194\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726492,\n\
\ \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726492\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7611940298507462,\n\
\ \"acc_stderr\": 0.03014777593540922,\n \"acc_norm\": 0.7611940298507462,\n\
\ \"acc_norm_stderr\": 0.03014777593540922\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7309941520467836,\n \"acc_stderr\": 0.03401052620104089,\n\
\ \"acc_norm\": 0.7309941520467836,\n \"acc_norm_stderr\": 0.03401052620104089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.38555691554467564,\n\
\ \"mc1_stderr\": 0.017038839010591673,\n \"mc2\": 0.5560955080575972,\n\
\ \"mc2_stderr\": 0.014788612119825833\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7387529597474349,\n \"acc_stderr\": 0.012346914863415314\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3009855951478393,\n \
\ \"acc_stderr\": 0.0126345044652112\n }\n}\n```"
repo_url: https://huggingface.co/steve-cse/MelloGPT
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|arc:challenge|25_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|gsm8k|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hellaswag|10_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T18-07-59.704375.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-16T18-07-59.704375.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- '**/details_harness|winogrande|5_2023-12-16T18-07-59.704375.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-16T18-07-59.704375.parquet'
- config_name: results
data_files:
- split: 2023_12_16T18_07_59.704375
path:
- results_2023-12-16T18-07-59.704375.parquet
- split: latest
path:
- results_2023-12-16T18-07-59.704375.parquet
---
# Dataset Card for Evaluation run of steve-cse/MelloGPT
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [steve-cse/MelloGPT](https://huggingface.co/steve-cse/MelloGPT) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_steve-cse__MelloGPT",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-16T18:07:59.704375](https://huggingface.co/datasets/open-llm-leaderboard/details_steve-cse__MelloGPT/blob/main/results_2023-12-16T18-07-59.704375.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one under the results and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5580941639624716,
"acc_stderr": 0.03400113351875304,
"acc_norm": 0.5629530798562845,
"acc_norm_stderr": 0.03471812426856797,
"mc1": 0.38555691554467564,
"mc1_stderr": 0.017038839010591673,
"mc2": 0.5560955080575972,
"mc2_stderr": 0.014788612119825833
},
"harness|arc:challenge|25": {
"acc": 0.5247440273037542,
"acc_stderr": 0.014593487694937738,
"acc_norm": 0.53839590443686,
"acc_norm_stderr": 0.01456824555029636
},
"harness|hellaswag|10": {
"acc": 0.5646285600477993,
"acc_stderr": 0.004947922692688834,
"acc_norm": 0.761202947619996,
"acc_norm_stderr": 0.004254771367531346
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.04244633238353229,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.04244633238353229
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5460526315789473,
"acc_stderr": 0.04051646342874142,
"acc_norm": 0.5460526315789473,
"acc_norm_stderr": 0.04051646342874142
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6113207547169811,
"acc_stderr": 0.030000485448675986,
"acc_norm": 0.6113207547169811,
"acc_norm_stderr": 0.030000485448675986
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6041666666666666,
"acc_stderr": 0.04089465449325582,
"acc_norm": 0.6041666666666666,
"acc_norm_stderr": 0.04089465449325582
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5549132947976878,
"acc_stderr": 0.03789401760283648,
"acc_norm": 0.5549132947976878,
"acc_norm_stderr": 0.03789401760283648
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077636,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.04489539350270699,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.04489539350270699
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155254,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155254
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.043062412591271526,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.043062412591271526
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6548387096774193,
"acc_stderr": 0.02704574657353433,
"acc_norm": 0.6548387096774193,
"acc_norm_stderr": 0.02704574657353433
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43349753694581283,
"acc_stderr": 0.03486731727419872,
"acc_norm": 0.43349753694581283,
"acc_norm_stderr": 0.03486731727419872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.03663974994391244,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.03663974994391244
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03173071239071724,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03173071239071724
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7409326424870466,
"acc_stderr": 0.031618779179354115,
"acc_norm": 0.7409326424870466,
"acc_norm_stderr": 0.031618779179354115
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5307692307692308,
"acc_stderr": 0.025302958890850154,
"acc_norm": 0.5307692307692308,
"acc_norm_stderr": 0.025302958890850154
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815635,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815635
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5630252100840336,
"acc_stderr": 0.03221943636566196,
"acc_norm": 0.5630252100840336,
"acc_norm_stderr": 0.03221943636566196
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7174311926605504,
"acc_stderr": 0.019304243497707152,
"acc_norm": 0.7174311926605504,
"acc_norm_stderr": 0.019304243497707152
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.03407632093854052,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.03407632093854052
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.032834720561085606,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.032834720561085606
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.679324894514768,
"acc_stderr": 0.0303819319499904,
"acc_norm": 0.679324894514768,
"acc_norm_stderr": 0.0303819319499904
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.03915345408847834,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.03915345408847834
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6809815950920245,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.6809815950920245,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489122,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489122
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.04656147110012351,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.04656147110012351
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077788,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077788
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7471264367816092,
"acc_stderr": 0.015543377313719681,
"acc_norm": 0.7471264367816092,
"acc_norm_stderr": 0.015543377313719681
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5982658959537572,
"acc_stderr": 0.026394104177643637,
"acc_norm": 0.5982658959537572,
"acc_norm_stderr": 0.026394104177643637
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25251396648044694,
"acc_stderr": 0.014530330201468647,
"acc_norm": 0.25251396648044694,
"acc_norm_stderr": 0.014530330201468647
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6405228758169934,
"acc_stderr": 0.027475969910660952,
"acc_norm": 0.6405228758169934,
"acc_norm_stderr": 0.027475969910660952
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6141479099678456,
"acc_stderr": 0.027648149599751464,
"acc_norm": 0.6141479099678456,
"acc_norm_stderr": 0.027648149599751464
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.027339546640662734,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.027339546640662734
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.375886524822695,
"acc_stderr": 0.028893955412115882,
"acc_norm": 0.375886524822695,
"acc_norm_stderr": 0.028893955412115882
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.39765319426336376,
"acc_stderr": 0.012499840347460642,
"acc_norm": 0.39765319426336376,
"acc_norm_stderr": 0.012499840347460642
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.030161911930767102,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.030161911930767102
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.020196594933541194,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.020196594933541194
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6530612244897959,
"acc_stderr": 0.030472526026726492,
"acc_norm": 0.6530612244897959,
"acc_norm_stderr": 0.030472526026726492
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7611940298507462,
"acc_stderr": 0.03014777593540922,
"acc_norm": 0.7611940298507462,
"acc_norm_stderr": 0.03014777593540922
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7309941520467836,
"acc_stderr": 0.03401052620104089,
"acc_norm": 0.7309941520467836,
"acc_norm_stderr": 0.03401052620104089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.38555691554467564,
"mc1_stderr": 0.017038839010591673,
"mc2": 0.5560955080575972,
"mc2_stderr": 0.014788612119825833
},
"harness|winogrande|5": {
"acc": 0.7387529597474349,
"acc_stderr": 0.012346914863415314
},
"harness|gsm8k|5": {
"acc": 0.3009855951478393,
"acc_stderr": 0.0126345044652112
}
}
```
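Because the results payload is a plain mapping keyed by task name, it is straightforward to scan programmatically, e.g. to flag tasks where the model scores poorly. A minimal sketch (accuracy values copied from the block above; only a few tasks shown for brevity):

```python
# Per-task accuracies, copied from the results JSON above.
results = {
    "harness|hendrycksTest-moral_scenarios|5": {"acc": 0.25251396648044694},
    "harness|hendrycksTest-marketing|5": {"acc": 0.8632478632478633},
    "harness|hendrycksTest-virology|5": {"acc": 0.5},
}

# Collect tasks whose accuracy falls below an arbitrary 0.4 threshold.
weak = [task for task, metrics in results.items() if metrics["acc"] < 0.4]
print(weak)  # only the moral_scenarios task is below 0.4 in this sample
```

The same pattern applies to the full results file once downloaded, since it shares this structure.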
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
diogofouto/dialogsum-augmented
---
license: apache-2.0
---
# DialogSum Enhanced Dataset
## Overview
DialogSum Enhanced is an extension of the original DialogSum dataset, enriched with a new column called 'Relevant Sentences.' This dataset is designed to facilitate research in dialogue summarization by providing additional information about the dialogue turns that GPT-4 considers relevant for generating summaries.
### Changes from DialogSum
The primary enhancement in DialogSum Enhanced is the inclusion of the 'Relevant Sentences' column. This column contains the dialogue turns that GPT-4 identified as crucial for the generation of a summary. This information can be valuable for understanding the model's decision-making process and improving dialogue summarization models.
### Split Information
- **Train Split:** The train split in DialogSum Enhanced consists of half of the original DialogSum train split.
- **Test and Validation Sets:** The test and validation sets in DialogSum Enhanced retain their full length from the original DialogSum dataset.
## Dataset Structure
The dataset is provided in a CSV format with the following columns:
1. **id:** Unique identifier for each dialogue.
2. **dialogue:** The sequential turns of the dialogue.
3. **relevant_sentences:** The dialogue turns that GPT-4 considered relevant for generating the summary.
4. **summary:** The reference summary for the dialogue.
## Usage
Researchers and practitioners interested in dialogue summarization can leverage DialogSum Enhanced for training, validating, and testing their models. The 'Relevant Sentences' column provides additional insights into the model's decision-making process during summarization.
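Since the dataset ships as CSV with the four columns described above, rows can be read with the standard `csv` module. The row below is a made-up illustration of the schema, not an actual dataset entry:

```python
import csv
import io

# Hypothetical row mirroring the four columns described above
# (id, dialogue, relevant_sentences, summary).
sample = io.StringIO(
    "id,dialogue,relevant_sentences,summary\n"
    '"train_0","#Person1#: Hi, how are you? #Person2#: Fine, thanks.",'
    '"#Person1#: Hi, how are you?","Two speakers exchange greetings."\n'
)
rows = list(csv.DictReader(sample))
print(rows[0]["relevant_sentences"])  # -> #Person1#: Hi, how are you?
```

To use the real files, point the reader (or `datasets.load_dataset("csv", ...)`) at the dataset's train/validation/test CSVs instead of the inline sample.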
atifss/gpt2-124M-qlora-chat-support
---
dataset_info:
features:
- name: answer
dtype: string
- name: question
dtype: string
splits:
- name: train
num_bytes: 17924
num_examples: 79
download_size: 9896
dataset_size: 17924
---
# Dataset Card for "gpt2-124M-qlora-chat-support"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
open-llm-leaderboard/details_ZoidBB__unraveled-7b-a1
---
pretty_name: Evaluation run of ZoidBB/unraveled-7b-a1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ZoidBB/unraveled-7b-a1](https://huggingface.co/ZoidBB/unraveled-7b-a1) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ZoidBB__unraveled-7b-a1_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-23T19:22:53.071269](https://huggingface.co/datasets/open-llm-leaderboard/details_ZoidBB__unraveled-7b-a1_public/blob/main/results_2023-11-23T19-22-53.071269.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.627016601295214,\n\
\ \"acc_stderr\": 0.03235619334922418,\n \"acc_norm\": 0.6365794887378388,\n\
\ \"acc_norm_stderr\": 0.03306927475699416,\n \"mc1\": 0.28151774785801714,\n\
\ \"mc1_stderr\": 0.01574402724825605,\n \"mc2\": 0.42228384526614654,\n\
\ \"mc2_stderr\": 0.014152177395393957,\n \"em\": 0.0017827181208053692,\n\
\ \"em_stderr\": 0.00043200973460387867,\n \"f1\": 0.06056837248322149,\n\
\ \"f1_stderr\": 0.0013671084143061485\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5648464163822525,\n \"acc_stderr\": 0.014487986197186045,\n\
\ \"acc_norm\": 0.5981228668941979,\n \"acc_norm_stderr\": 0.014327268614578274\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.635929097789285,\n\
\ \"acc_stderr\": 0.004801852881329739,\n \"acc_norm\": 0.8280223063134834,\n\
\ \"acc_norm_stderr\": 0.0037658983649388736\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.039105257528497236,\n\
\ \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.039105257528497236\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7580645161290323,\n \"acc_stderr\": 0.024362599693031096,\n \"\
acc_norm\": 0.7580645161290323,\n \"acc_norm_stderr\": 0.024362599693031096\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4827586206896552,\n \"acc_stderr\": 0.035158955511657,\n \"acc_norm\"\
: 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511657\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026704,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026704\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758733,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.362962962962963,\n \"acc_stderr\": 0.029318203645206865,\n \
\ \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.029318203645206865\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.031282177063684614,\n \
\ \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.031282177063684614\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.039439666991836285,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.039439666991836285\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8073394495412844,\n \"acc_stderr\": 0.016909276884936073,\n \"\
acc_norm\": 0.8073394495412844,\n \"acc_norm_stderr\": 0.016909276884936073\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5787037037037037,\n \"acc_stderr\": 0.033674621388960775,\n \"\
acc_norm\": 0.5787037037037037,\n \"acc_norm_stderr\": 0.033674621388960775\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.0286265479124374,\n \"acc_norm\"\
: 0.7892156862745098,\n \"acc_norm_stderr\": 0.0286265479124374\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676173,\n \"\
acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676173\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.04373313040914761,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.04373313040914761\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.03157065078911901,\n\
\ \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.03157065078911901\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.013890862162876163,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.013890862162876163\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.024476994076247333,\n\
\ \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.024476994076247333\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3039106145251397,\n\
\ \"acc_stderr\": 0.015382845587584517,\n \"acc_norm\": 0.3039106145251397,\n\
\ \"acc_norm_stderr\": 0.015382845587584517\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.02495418432487991,\n\
\ \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.02495418432487991\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n\
\ \"acc_stderr\": 0.026385273703464485,\n \"acc_norm\": 0.684887459807074,\n\
\ \"acc_norm_stderr\": 0.026385273703464485\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.02517104191530968,\n\
\ \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.02517104191530968\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4426336375488918,\n\
\ \"acc_stderr\": 0.012685906538206247,\n \"acc_norm\": 0.4426336375488918,\n\
\ \"acc_norm_stderr\": 0.012685906538206247\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462923,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462923\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6437908496732027,\n \"acc_stderr\": 0.0193733324207245,\n \
\ \"acc_norm\": 0.6437908496732027,\n \"acc_norm_stderr\": 0.0193733324207245\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.02519692987482706,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.02519692987482706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28151774785801714,\n\
\ \"mc1_stderr\": 0.01574402724825605,\n \"mc2\": 0.42228384526614654,\n\
\ \"mc2_stderr\": 0.014152177395393957\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7719021310181531,\n \"acc_stderr\": 0.011793015817663592\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.0017827181208053692,\n \
\ \"em_stderr\": 0.00043200973460387867,\n \"f1\": 0.06056837248322149,\n\
\ \"f1_stderr\": 0.0013671084143061485\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.14329037149355572,\n \"acc_stderr\": 0.009650895723357585\n\
\ }\n}\n```"
repo_url: https://huggingface.co/ZoidBB/unraveled-7b-a1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|arc:challenge|25_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|drop|3_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|gsm8k|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hellaswag|10_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T19-22-53.071269.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-23T19-22-53.071269.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- '**/details_harness|winogrande|5_2023-11-23T19-22-53.071269.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-23T19-22-53.071269.parquet'
- config_name: results
data_files:
- split: 2023_11_23T19_22_53.071269
path:
- results_2023-11-23T19-22-53.071269.parquet
- split: latest
path:
- results_2023-11-23T19-22-53.071269.parquet
---
# Dataset Card for Evaluation run of ZoidBB/unraveled-7b-a1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ZoidBB/unraveled-7b-a1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ZoidBB/unraveled-7b-a1](https://huggingface.co/ZoidBB/unraveled-7b-a1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ZoidBB__unraveled-7b-a1_public",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-11-23T19:22:53.071269](https://huggingface.co/datasets/open-llm-leaderboard/details_ZoidBB__unraveled-7b-a1_public/blob/main/results_2023-11-23T19-22-53.071269.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.627016601295214,
"acc_stderr": 0.03235619334922418,
"acc_norm": 0.6365794887378388,
"acc_norm_stderr": 0.03306927475699416,
"mc1": 0.28151774785801714,
"mc1_stderr": 0.01574402724825605,
"mc2": 0.42228384526614654,
"mc2_stderr": 0.014152177395393957,
"em": 0.0017827181208053692,
"em_stderr": 0.00043200973460387867,
"f1": 0.06056837248322149,
"f1_stderr": 0.0013671084143061485
},
"harness|arc:challenge|25": {
"acc": 0.5648464163822525,
"acc_stderr": 0.014487986197186045,
"acc_norm": 0.5981228668941979,
"acc_norm_stderr": 0.014327268614578274
},
"harness|hellaswag|10": {
"acc": 0.635929097789285,
"acc_stderr": 0.004801852881329739,
"acc_norm": 0.8280223063134834,
"acc_norm_stderr": 0.0037658983649388736
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.039105257528497236,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.039105257528497236
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7580645161290323,
"acc_stderr": 0.024362599693031096,
"acc_norm": 0.7580645161290323,
"acc_norm_stderr": 0.024362599693031096
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511657,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511657
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026704,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026704
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758733,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.029318203645206865,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.029318203645206865
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.031282177063684614,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.031282177063684614
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.039439666991836285,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.039439666991836285
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8073394495412844,
"acc_stderr": 0.016909276884936073,
"acc_norm": 0.8073394495412844,
"acc_norm_stderr": 0.016909276884936073
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5787037037037037,
"acc_stderr": 0.033674621388960775,
"acc_norm": 0.5787037037037037,
"acc_norm_stderr": 0.033674621388960775
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.0286265479124374,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.0286265479124374
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676173,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676173
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.031602951437766785,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.031602951437766785
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.04373313040914761,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.04373313040914761
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.03157065078911901,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.03157065078911901
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.013890862162876163,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.013890862162876163
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.024476994076247333,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.024476994076247333
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3039106145251397,
"acc_stderr": 0.015382845587584517,
"acc_norm": 0.3039106145251397,
"acc_norm_stderr": 0.015382845587584517
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.02495418432487991,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.02495418432487991
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464485,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464485
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.02517104191530968,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.02517104191530968
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4426336375488918,
"acc_stderr": 0.012685906538206247,
"acc_norm": 0.4426336375488918,
"acc_norm_stderr": 0.012685906538206247
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462923,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462923
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6437908496732027,
"acc_stderr": 0.0193733324207245,
"acc_norm": 0.6437908496732027,
"acc_norm_stderr": 0.0193733324207245
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482706,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28151774785801714,
"mc1_stderr": 0.01574402724825605,
"mc2": 0.42228384526614654,
"mc2_stderr": 0.014152177395393957
},
"harness|winogrande|5": {
"acc": 0.7719021310181531,
"acc_stderr": 0.011793015817663592
},
"harness|drop|3": {
"em": 0.0017827181208053692,
"em_stderr": 0.00043200973460387867,
"f1": 0.06056837248322149,
"f1_stderr": 0.0013671084143061485
},
"harness|gsm8k|5": {
"acc": 0.14329037149355572,
"acc_stderr": 0.009650895723357585
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
WangX0111/MLIR-Passes | ---
license: mit
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
MLIR files and their associated passes, crawled from the official LLVM/MLIR repository.
### Supported Tasks and Leaderboards
Code LLM for MLIR optimization
### Languages
MLIR
|
Oivalf23/jhony | ---
license: openrail
---
|
flax-sentence-embeddings/stackexchange_titlebody_best_voted_answer_jsonl | ---
annotations_creators:
- found
language_creators:
- found
language:
- en
license:
- cc-by-nc-sa-4.0
multilinguality:
- multilingual
pretty_name: stackexchange
size_categories:
- unknown
source_datasets:
- original
task_categories:
- question-answering
task_ids:
- closed-domain-qa
---
# Dataset Card Creation Guide
## Table of Contents
- [Dataset Card Creation Guide](#dataset-card-creation-guide)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
  - [Who are the source language producers?](#who-are-the-source-language-producers)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [stackexchange](https://archive.org/details/stackexchange)
- **Repository:** [flax-sentence-embeddings](https://github.com/nreimers/flax-sentence-embeddings)
### Dataset Summary
We automatically extracted question and answer (Q&A) pairs from the [Stack Exchange](https://stackexchange.com/) network. Stack Exchange gathers many Q&A communities across 50 online platforms, including the well-known Stack Overflow and other technical sites. 100 million developers consult Stack Exchange every month. The dataset is a parallel corpus with each question mapped to the top-rated answer. The dataset is split by community, covering a variety of domains from 3D printing, economics, and Raspberry Pi to Emacs. An exhaustive list of all communities is available [here](https://stackexchange.com/sites).
### Languages
Stack Exchange content mainly consists of English (en).
## Dataset Structure
### Data Instances
Each data sample is presented as follows:
```
{'title_body': 'How to determine if 3 points on a 3-D graph are collinear? Let the points $A, B$ and $C$ be $(x_1, y_1, z_1), (x_2, y_2, z_2)$ and $(x_3, y_3, z_3)$ respectively. How do I prove that the 3 points are collinear? What is the formula?',
 'upvoted_answer': 'From $A(x_1,y_1,z_1),B(x_2,y_2,z_2),C(x_3,y_3,z_3)$ we can get their position vectors.\n\n$\\vec{AB}=(x_2-x_1,y_2-y_1,z_2-z_1)$ and $\\vec{AC}=(x_3-x_1,y_3-y_1,z_3-z_1)$.\n\nThen $||\\vec{AB}\\times\\vec{AC}||=0\\implies A,B,C$ collinear.'}
```
This particular example corresponds to the [following page](https://math.stackexchange.com/questions/947555/how-to-determine-if-3-points-on-a-3-d-graph-are-collinear).
### Data Fields
The fields present in the dataset contain the following information:
- `title_body`: This is the concatenation of the title and body from the question
- `upvoted_answer`: This is the body from the most upvoted answer
### Data Splits
We provide multiple splits for this dataset, each of which refers to a given community channel. We detail the number of pairs for each split below:
| | Number of pairs |
| ----- | ------ |
| apple | 92,487 |
| english | 100,640 |
| codereview | 41,748 |
| dba | 71,449 |
| mathoverflow | 85,289 |
| electronics | 129,494 |
| mathematica | 59,895 |
| drupal | 67,817 |
| magento | 79,241 |
| gaming | 82,887 |
| ell | 77,892 |
| gamedev | 40,154 |
| gis | 100,254 |
| askubuntu | 267,135 |
| diy | 52,896 |
| academia | 32,137 |
| blender | 54,153 |
| cs | 30,010 |
| chemistry | 27,061 |
| judaism | 26,085 |
| crypto | 19,404 |
| android | 38,077 |
| ja | 17,376 |
| christianity | 11,498 |
| graphicdesign | 28,083 |
| aviation | 18,755 |
| ethereum | 26,124 |
| biology | 19,277 |
| datascience | 20,503 |
| law | 16,133 |
| dsp | 17,430 |
| japanese | 20,948 |
| hermeneutics | 9,516 |
| bicycles | 15,708 |
| arduino | 16,281 |
| history | 10,766 |
| bitcoin | 22,474 |
| cooking | 22,641 |
| hinduism | 8,999 |
| codegolf | 8,211 |
| boardgames | 11,805 |
| emacs | 16,830 |
| economics | 8,844 |
| gardening | 13,246 |
| astronomy | 9,086 |
| islam | 10,052 |
| german | 13,733 |
| fitness | 8,297 |
| french | 10,578 |
| anime | 10,131 |
| craftcms | 11,236 |
| cstheory | 7,742 |
| engineering | 8,649 |
| buddhism | 6,787 |
| linguistics | 6,843 |
| ai | 5,763 |
| expressionengine | 10,742 |
| cogsci | 5,101 |
| chinese | 8,646 |
| chess | 6,392 |
| civicrm | 10,648 |
| literature | 3,539 |
| interpersonal | 3,398 |
| health | 4,494 |
| avp | 6,450 |
| earthscience | 4,396 |
| joomla | 5,887 |
| homebrew | 5,608 |
| expatriates | 4,913 |
| latin | 3,969 |
| matheducators | 2,706 |
| ham | 3,501 |
| genealogy | 2,895 |
| 3dprinting | 3,488 |
| elementaryos | 5,917 |
| bioinformatics | 3,135 |
| devops | 3,462 |
| hsm | 2,517 |
| italian | 3,101 |
| computergraphics | 2,306 |
| martialarts | 1,737 |
| bricks | 3,530 |
| freelancing | 1,663 |
| crafts | 1,659 |
| lifehacks | 2,576 |
| cseducators | 902 |
| materials | 1,101 |
| hardwarerecs | 2,050 |
| iot | 1,359 |
| eosio | 1,940 |
| languagelearning | 948 |
| korean | 1,406 |
| coffee | 1,188 |
| esperanto | 1,466 |
| beer | 1,012 |
| ebooks | 1,107 |
| iota | 775 |
| cardano | 248 |
| drones | 496 |
| conlang | 334 |
| pt | 103,277 |
| stats | 115,679 |
| unix | 155,414 |
| physics | 141,230 |
| tex | 171,628 |
| serverfault | 238,507 |
| salesforce | 87,272 |
| wordpress | 83,621 |
| softwareengineering | 51,326 |
| scifi | 54,805 |
| security | 51,355 |
| ru | 253,289 |
| superuser | 352,610 |
| sharepoint | 80,420 |
| rpg | 40,435 |
| travel | 36,533 |
| worldbuilding | 26,210 |
| meta | 1,000 |
| workplace | 24,012 |
| ux | 28,901 |
| money | 29,404 |
| webmasters | 30,370 |
| raspberrypi | 24,143 |
| photo | 23,204 |
| music | 19,936 |
| philosophy | 13,114 |
| puzzling | 17,448 |
| movies | 18,243 |
| quant | 12,933 |
| politics | 11,047 |
| space | 12,893 |
| mechanics | 18,613 |
| skeptics | 8,145 |
| rus | 16,528 |
| writers | 9,867 |
| webapps | 24,867 |
| softwarerecs | 11,761 |
| networkengineering | 12,590 |
| parenting | 5,998 |
| scicomp | 7,036 |
| sqa | 9,256 |
| sitecore | 7,838 |
| vi | 9,000 |
| spanish | 7,675 |
| pm | 5,435 |
| pets | 6,156 |
| sound | 8,303 |
| reverseengineering | 5,817 |
| outdoors | 5,278 |
| tridion | 5,907 |
| retrocomputing | 3,907 |
| robotics | 4,648 |
| quantumcomputing | 4,320 |
| sports | 4,707 |
| russian | 3,937 |
| opensource | 3,221 |
| woodworking | 2,955 |
| patents | 3,573 |
| tor | 4,167 |
| ukrainian | 1,767 |
| opendata | 3,842 |
| monero | 3,508 |
| sustainability | 1,674 |
| portuguese | 1,964 |
| mythology | 1,595 |
| musicfans | 2,431 |
| or | 1,490 |
| poker | 1,665 |
| windowsphone | 2,807 |
| moderators | 504 |
| stackapps | 1,518 |
| stellar | 1,078 |
| vegetarianism | 585 |
| tezos | 1,169 |
| total | 4,750,619 |
## Dataset Creation
### Curation Rationale
We primary designed this dataset for sentence embeddings training. Indeed sentence embeddings may be trained using a contrastive learning setup for which the model is trained to associate each sentence with its corresponding pair out of multiple proposition. Such models require many examples to be efficient and thus the dataset creation may be tedious. Community networks such as Stack Exchange allow us to build many examples semi-automatically.
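For illustration, the in-batch contrastive objective described above can be sketched in a few lines of numpy. This is a toy sketch with illustrative names, not the actual training code (which would use a deep-learning framework): each question embedding is pulled toward its paired answer embedding, while the other answers in the batch serve as negatives.

```python
import numpy as np

def in_batch_contrastive_loss(q_emb, a_emb, temperature=0.05):
    """Multiple-negatives ranking loss: row i of a_emb is the positive
    for row i of q_emb; every other row in the batch is a negative."""
    # Cosine similarity between every question and every answer.
    q = q_emb / np.linalg.norm(q_emb, axis=1, keepdims=True)
    a = a_emb / np.linalg.norm(a_emb, axis=1, keepdims=True)
    scores = q @ a.T / temperature  # shape (batch, batch)
    # Softmax cross-entropy with the diagonal as the target class.
    log_probs = scores - np.log(np.exp(scores).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

# Toy batch: 3 question/answer embedding pairs of dimension 4.
rng = np.random.default_rng(0)
q = rng.normal(size=(3, 4))
aligned_loss = in_batch_contrastive_loss(q, q)          # matched pairs
shuffled_loss = in_batch_contrastive_loss(q, q[::-1])   # mismatched pairs
```

Matched pairs yield a lower loss than mismatched ones, which is exactly the signal that makes each question-answer pair from this dataset a usable training example.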
### Source Data
The source data are dumps from [Stack Exchange](https://archive.org/details/stackexchange)
#### Initial Data Collection and Normalization
We collected the data from the math community.
We filtered out questions whose title or body is shorter than 20 characters, and questions whose body is longer than 4096 characters.
When extracting the most upvoted answer, we kept only pairs for which there is at least a 100-vote gap between the most upvoted and the most downvoted answers.
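The filters above can be sketched as follows (hypothetical helpers, not the actual extraction code; `answers` is assumed to be a list of dicts carrying a `score` field, as in the Stack Exchange dumps):

```python
def keep_question(title: str, body: str) -> bool:
    """Length filters: drop questions whose title or body is shorter
    than 20 characters, or whose body exceeds 4096 characters."""
    return len(title) >= 20 and 20 <= len(body) <= 4096

def keep_pair(answers: list[dict]) -> bool:
    """Vote-gap filter: keep the pair only when the most upvoted answer
    leads the most downvoted one by at least 100 votes."""
    votes = [a["score"] for a in answers]
    return max(votes) - min(votes) >= 100
```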
#### Who are the source language producers?
Questions and answers are written by the community developers of Stack Exchange.
## Additional Information
### Licensing Information
Please see the license information at: https://archive.org/details/stackexchange
### Citation Information
```
@misc{StackExchangeDataset,
author = {Flax Sentence Embeddings Team},
title = {Stack Exchange question pairs},
year = {2021},
howpublished = {https://huggingface.co/datasets/flax-sentence-embeddings/},
}
```
### Contributions
Thanks to the Flax Sentence Embeddings team for adding this dataset. |
allenai/multinews_dense_max | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license:
- other
multilinguality:
- monolingual
pretty_name: Multi-News
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- summarization
task_ids:
- news-articles-summarization
paperswithcode_id: multi-news
train-eval-index:
- config: default
task: summarization
task_id: summarization
splits:
train_split: train
eval_split: test
col_mapping:
document: text
summary: target
metrics:
- type: rouge
name: Rouge
---
This is a copy of the [Multi-News](https://huggingface.co/datasets/multi_news) dataset, except that the input source documents of its `test` split have been replaced by documents retrieved with a __dense__ retriever. The retrieval pipeline used:
- __query__: The `summary` field of each example
- __corpus__: The union of all documents in the `train`, `validation` and `test` splits
- __retriever__: [`facebook/contriever-msmarco`](https://huggingface.co/facebook/contriever-msmarco) via [PyTerrier](https://pyterrier.readthedocs.io/en/latest/) with default settings
- __top-k strategy__: `"max"`, i.e. the number of documents retrieved, `k`, is set as the maximum number of documents seen across examples in this dataset, in this case `k==10`
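The metrics reported in the tables below can be computed per query as in this sketch (an illustrative helper, not the evaluation code actually used for this dataset; reported scores are averaged over queries):

```python
def retrieval_metrics(retrieved: list[str], relevant: set[str]):
    """Compute Precision@k, Recall@k and R-Precision for one query.

    retrieved: ranked list of doc ids returned by the retriever (length k)
    relevant:  set of gold source-document ids for the example
    """
    hits = [d for d in retrieved if d in relevant]
    precision_at_k = len(hits) / len(retrieved)
    recall_at_k = len(hits) / len(relevant)
    # R-Precision: precision over the top-R results, R = number of gold docs.
    r = len(relevant)
    rprec = sum(1 for d in retrieved[:r] if d in relevant) / r
    return precision_at_k, recall_at_k, rprec

p, rec, rp = retrieval_metrics(["d1", "d2", "d3", "d4"], {"d1", "d3", "d9"})
```

Because `k` is fixed at 10 while most Multi-News examples have far fewer gold source documents, Precision@k is naturally low even when recall is high.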
Retrieval results on the `train` set:
| Recall@100 | Rprec | Precision@k | Recall@k |
| ----------- | ----------- | ----------- | ----------- |
| 0.8661 | 0.6867 | 0.2118 | 0.7966 |
Retrieval results on the `validation` set:
| Recall@100 | Rprec | Precision@k | Recall@k |
| ----------- | ----------- | ----------- | ----------- |
| 0.8626 | 0.6859 | 0.2083 | 0.7949 |
Retrieval results on the `test` set:
| Recall@100 | Rprec | Precision@k | Recall@k |
| ----------- | ----------- | ----------- | ----------- |
| 0.8625 | 0.6927 | 0.2096 | 0.7971 | |
distilled-one-sec-cv12-each-chunk-uniq/chunk_68 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1114218784.0
num_examples: 217112
download_size: 1136538595
dataset_size: 1114218784.0
---
# Dataset Card for "chunk_68"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_HWERI__pythia-70m-deduped-cleansharegpt | ---
pretty_name: Evaluation run of HWERI/pythia-70m-deduped-cleansharegpt
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [HWERI/pythia-70m-deduped-cleansharegpt](https://huggingface.co/HWERI/pythia-70m-deduped-cleansharegpt)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
 \ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
 \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
 \ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_HWERI__pythia-70m-deduped-cleansharegpt\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-29T05:46:53.221413](https://huggingface.co/datasets/open-llm-leaderboard/details_HWERI__pythia-70m-deduped-cleansharegpt/blob/main/results_2023-10-29T05-46-53.221413.json) (note\
 \ that there might be results for other tasks in the repos if successive evals didn't\
 \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0,\n \"\
em_stderr\": 0.0,\n \"f1\": 0.0,\n \"f1_stderr\": 0.0,\n \"\
acc\": 0.2600631412786109,\n \"acc_stderr\": 0.007020548332172166\n },\n\
\ \"harness|drop|3\": {\n \"em\": 0.0,\n \"em_stderr\": 0.0,\n\
\ \"f1\": 0.0,\n \"f1_stderr\": 0.0\n },\n \"harness|gsm8k|5\"\
: {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5201262825572218,\n \"acc_stderr\": 0.014041096664344332\n\
\ }\n}\n```"
repo_url: https://huggingface.co/HWERI/pythia-70m-deduped-cleansharegpt
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|arc:challenge|25_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_29T05_46_53.221413
path:
- '**/details_harness|drop|3_2023-10-29T05-46-53.221413.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-29T05-46-53.221413.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_29T05_46_53.221413
path:
- '**/details_harness|gsm8k|5_2023-10-29T05-46-53.221413.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-29T05-46-53.221413.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hellaswag|10_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T12-28-53.949092.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T12-28-53.949092.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T12-28-53.949092.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_29T05_46_53.221413
path:
- '**/details_harness|winogrande|5_2023-10-29T05-46-53.221413.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-29T05-46-53.221413.parquet'
- config_name: results
data_files:
- split: 2023_09_13T12_28_53.949092
path:
- results_2023-09-13T12-28-53.949092.parquet
- split: 2023_10_29T05_46_53.221413
path:
- results_2023-10-29T05-46-53.221413.parquet
- split: latest
path:
- results_2023-10-29T05-46-53.221413.parquet
---
# Dataset Card for Evaluation run of HWERI/pythia-70m-deduped-cleansharegpt
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/HWERI/pythia-70m-deduped-cleansharegpt
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [HWERI/pythia-70m-deduped-cleansharegpt](https://huggingface.co/HWERI/pythia-70m-deduped-cleansharegpt) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_HWERI__pythia-70m-deduped-cleansharegpt",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-29T05:46:53.221413](https://huggingface.co/datasets/open-llm-leaderboard/details_HWERI__pythia-70m-deduped-cleansharegpt/blob/main/results_2023-10-29T05-46-53.221413.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 0.0,
"f1_stderr": 0.0,
"acc": 0.2600631412786109,
"acc_stderr": 0.007020548332172166
},
"harness|drop|3": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 0.0,
"f1_stderr": 0.0
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5201262825572218,
"acc_stderr": 0.014041096664344332
}
}
```
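The nested dictionary above is keyed by task, then by metric. For quick tabulation or comparison across runs it can be flattened into single "task/metric" keys; a minimal sketch in plain Python (no extra dependencies, using the values shown above):

```python
# Flatten the nested leaderboard results into "task/metric" keys.
results = {
    "all": {"em": 0.0, "em_stderr": 0.0, "f1": 0.0, "f1_stderr": 0.0,
            "acc": 0.2600631412786109, "acc_stderr": 0.007020548332172166},
    "harness|drop|3": {"em": 0.0, "em_stderr": 0.0, "f1": 0.0, "f1_stderr": 0.0},
    "harness|gsm8k|5": {"acc": 0.0, "acc_stderr": 0.0},
    "harness|winogrande|5": {"acc": 0.5201262825572218,
                             "acc_stderr": 0.014041096664344332},
}

flat = {f"{task}/{metric}": value
        for task, metrics in results.items()
        for metric, value in metrics.items()}

for key, value in sorted(flat.items()):
    print(f"{key}: {value}")
```

The same flattening works on any run's `results_*.json` once it is parsed with `json.load`.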
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
miehin/test13aug | ---
license: openrail
---
|
arthurmluz/GPTextSum_data-wiki_gptextsum_results | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: summary
dtype: string
- name: gen_summary
dtype: string
- name: rouge
struct:
- name: rouge1
dtype: float64
- name: rouge2
dtype: float64
- name: rougeL
dtype: float64
- name: rougeLsum
dtype: float64
- name: bert
struct:
- name: f1
sequence: float64
- name: hashcode
dtype: string
- name: precision
sequence: float64
- name: recall
sequence: float64
splits:
- name: validation
num_bytes: 28733
num_examples: 20
download_size: 35161
dataset_size: 28733
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "GPTextSum_data-wiki_gptextsum_results"
Metrics on the validation split:

rouge = {'rouge1': 0.47135937656590865, 'rouge2': 0.25940674061381325, 'rougeL': 0.38504581702889945, 'rougeLsum': 0.38504581702889945}

bert = {'precision': 0.788231149315834, 'recall': 0.7805202662944793, 'f1': 0.7840157747268677} |
lhallee/abstract_domain_copd | ---
dataset_info:
features:
- name: a
dtype: string
- name: b
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 477301584
num_examples: 132453
- name: valid
num_bytes: 9596971
num_examples: 2676
- name: test
num_bytes: 4758204
num_examples: 1294
download_size: 200765538
dataset_size: 491656759
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
---
This dataset contains the co-citation abstracts related to COPD from the paper [Contrastive Learning and Mixture of Experts Enables Precise Vector Embeddings](https://arxiv.org/abs/2401.15713).
|
AdapterOcean/physics_dataset_standardized_cluster_4_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 7334043
num_examples: 3437
download_size: 0
dataset_size: 7334043
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "physics_dataset_standardized_cluster_4_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pourmand1376/alpaca-fa-multi | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: conversation
dtype: string
- name: SOURCE
dtype: string
splits:
- name: train
num_bytes: 2694606
num_examples: 3907
download_size: 1227811
dataset_size: 2694606
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: apache-2.0
task_categories:
- question-answering
- text-generation
- conversational
language:
- fa
pretty_name: Alpaca Farsi Multi-turn
size_categories:
- 10K<n<100K
---
# Dataset Card for "alpaca-fa-multi"
This dataset was first published [here](https://www.kaggle.com/datasets/amirpourmand/alpaca-farsi) and then converted to this format to comply with Open-Assistant standards.
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DataHammer/emotional_dialog | ---
license: apache-2.0
task_categories:
- text-generation
- conversational
language:
- zh
size_categories:
- 1K<n<10K
---
# Scientific Emotional Dialogue
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This is a dataset for emotional multi-turn dialogue for scientific research personnel. It consists of 1069 dialogues with 2709 turns. The dialogues were first written by NLP practitioners and then expanded with GPT-4.
### Supported Tasks and Leaderboards
- Emotional Dialogue: The dataset can be used for instruction tuning on emotional dialogue.
### Languages
Chinese
## Dataset Structure
### Data Instances
A typical instance in the dataset:
```
[
"Human: 真是郁闷,我的论文被拒了,我觉得自己完全失败了。",
"Bot: 嘿,别这么想嘛,论文被拒并不代表你是个失败者。审稿人的意见只是一种反馈,你可以从中学到很多东西。别放弃,修改论文并尝试其他期刊,我相信你能成功的!",
"Human: 嗯,或许你说得对。但我还是觉得很失望,觉得自己好像没进步。",
"Bot: 失望是正常的情绪,但不要让它打击你。论文写作是一个成长的过程,每次经历都是宝贵的。相信自己的能力,继续努力,你一定能取得进步的!",
"Human: 谢谢你的鼓励,我会再试一次的。",
"Bot: 太棒了!相信自己,你可以做到的!"
]
```
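Each instance is a flat list of strings in which the speaker is encoded as a `"Human: "` or `"Bot: "` prefix. A minimal sketch (plain Python, using two turns from the instance above) of splitting each string into a (role, text) pair:

```python
# Split "Human: ..." / "Bot: ..." strings into (role, text) pairs.
dialogue = [
    "Human: 真是郁闷,我的论文被拒了,我觉得自己完全失败了。",
    "Bot: 嘿,别这么想嘛,论文被拒并不代表你是个失败者。",
]

# Split only on the first ": " so colons inside the utterance survive.
turns = [tuple(turn.split(": ", 1)) for turn in dialogue]

for role, text in turns:
    print(role, "->", text)
```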
|
gracefulibk/yorubamt | ---
license: afl-3.0
---
|
Asap7772/skewlognormal_maxlength | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: output
dtype: string
- name: text
dtype: string
- name: alpaca_text
dtype: string
- name: prompt
dtype: string
- name: alpaca_prompt
dtype: string
- name: y_ref
dtype: string
- name: y_1
dtype: string
- name: y_2
dtype: string
- name: y_w
dtype: string
- name: y_w_alpaca
dtype: string
- name: y_l
dtype: string
- name: y_l_alpaca
dtype: string
- name: y_w_score
dtype: float64
- name: y_l_score
dtype: float64
- name: score_diff
dtype: float64
splits:
- name: train
num_bytes: 77844991
num_examples: 19000
- name: test
num_bytes: 4082779
num_examples: 1000
download_size: 40225253
dataset_size: 81927770
---
# Dataset Card for "skewlognormal_maxlength"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sxdave/emotion_detection | ---
task_categories:
- image-classification
language:
- en
tags:
- happy
- sad
- neutral
size_categories:
- n<1K
--- |
mizunorlk/mizuno | ---
license: openrail
---
|
xaviviro/oasst1_ca | ---
dataset_info:
features:
- name: message_id
dtype: string
- name: parent_id
dtype: string
- name: user_id
dtype: string
- name: created_date
dtype: string
- name: text
dtype: string
- name: role
dtype: string
- name: lang
dtype: string
- name: review_count
dtype: int64
- name: review_result
dtype: bool
- name: deleted
dtype: bool
- name: rank
dtype: float64
- name: synthetic
dtype: bool
- name: model_name
dtype: 'null'
- name: detoxify
struct:
- name: identity_attack
dtype: float64
- name: insult
dtype: float64
- name: obscene
dtype: float64
- name: severe_toxicity
dtype: float64
- name: sexual_explicit
dtype: float64
- name: threat
dtype: float64
- name: toxicity
dtype: float64
- name: message_tree_id
dtype: string
- name: tree_state
dtype: string
- name: emojis
struct:
- name: count
sequence: int64
- name: name
sequence: string
- name: labels
struct:
- name: count
sequence: int64
- name: name
sequence: string
- name: value
sequence: float64
splits:
- name: validation
num_bytes: 4690639
num_examples: 4399
- name: train
num_bytes: 89877756
num_examples: 84398
download_size: 33007102
dataset_size: 94568395
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: train
path: data/train-*
language:
- ca
--- |
open-llm-leaderboard/details_lu-vae__llama2-13b-sharegpt4-test | ---
pretty_name: Evaluation run of lu-vae/llama2-13b-sharegpt4-test
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lu-vae/llama2-13b-sharegpt4-test](https://huggingface.co/lu-vae/llama2-13b-sharegpt4-test)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lu-vae__llama2-13b-sharegpt4-test\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-25T11:44:44.255542](https://huggingface.co/datasets/open-llm-leaderboard/details_lu-vae__llama2-13b-sharegpt4-test/blob/main/results_2023-10-25T11-44-44.255542.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.002202181208053691,\n\
\ \"em_stderr\": 0.00048005108166192744,\n \"f1\": 0.06610843120805385,\n\
\ \"f1_stderr\": 0.0014639889801036593,\n \"acc\": 0.44600618846762125,\n\
\ \"acc_stderr\": 0.01064352054021588\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.002202181208053691,\n \"em_stderr\": 0.00048005108166192744,\n\
\ \"f1\": 0.06610843120805385,\n \"f1_stderr\": 0.0014639889801036593\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13115996967399546,\n \
\ \"acc_stderr\": 0.00929849923558786\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.760852407261247,\n \"acc_stderr\": 0.011988541844843902\n\
\ }\n}\n```"
repo_url: https://huggingface.co/lu-vae/llama2-13b-sharegpt4-test
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|arc:challenge|25_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_25T11_44_44.255542
path:
- '**/details_harness|drop|3_2023-10-25T11-44-44.255542.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-25T11-44-44.255542.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_25T11_44_44.255542
path:
- '**/details_harness|gsm8k|5_2023-10-25T11-44-44.255542.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-25T11-44-44.255542.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hellaswag|10_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T16-41-26.998548.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T16-41-26.998548.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T16-41-26.998548.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_25T11_44_44.255542
path:
- '**/details_harness|winogrande|5_2023-10-25T11-44-44.255542.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-25T11-44-44.255542.parquet'
- config_name: results
data_files:
- split: 2023_09_12T16_41_26.998548
path:
- results_2023-09-12T16-41-26.998548.parquet
- split: 2023_10_25T11_44_44.255542
path:
- results_2023-10-25T11-44-44.255542.parquet
- split: latest
path:
- results_2023-10-25T11-44-44.255542.parquet
---
# Dataset Card for Evaluation run of lu-vae/llama2-13b-sharegpt4-test
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lu-vae/llama2-13b-sharegpt4-test
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lu-vae/llama2-13b-sharegpt4-test](https://huggingface.co/lu-vae/llama2-13b-sharegpt4-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lu-vae__llama2-13b-sharegpt4-test",
"harness_winogrande_5",
split="train")
```
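As the file listing above shows, each timestamped split name is derived from the run's ISO timestamp by replacing `-` and `:` with `_` (a small sketch of the observed convention, not a documented API):

```python
def timestamp_to_split(ts: str) -> str:
    # Observed mapping in this repo's config listing:
    # "2023-10-25T11:44:44.255542" -> split "2023_10_25T11_44_44.255542"
    # '-' and ':' become '_'; 'T' and '.' are kept as-is.
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-10-25T11:44:44.255542"))
```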
## Latest results
These are the [latest results from run 2023-10-25T11:44:44.255542](https://huggingface.co/datasets/open-llm-leaderboard/details_lu-vae__llama2-13b-sharegpt4-test/blob/main/results_2023-10-25T11-44-44.255542.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.002202181208053691,
"em_stderr": 0.00048005108166192744,
"f1": 0.06610843120805385,
"f1_stderr": 0.0014639889801036593,
"acc": 0.44600618846762125,
"acc_stderr": 0.01064352054021588
},
"harness|drop|3": {
"em": 0.002202181208053691,
"em_stderr": 0.00048005108166192744,
"f1": 0.06610843120805385,
"f1_stderr": 0.0014639889801036593
},
"harness|gsm8k|5": {
"acc": 0.13115996967399546,
"acc_stderr": 0.00929849923558786
},
"harness|winogrande|5": {
"acc": 0.760852407261247,
"acc_stderr": 0.011988541844843902
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
ravithejads/alpaca_urdu_cleaned_input | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: telugu_instruction
dtype: string
- name: telugu_input
dtype: string
- name: telugu_output
dtype: string
- name: telugu_transliterated_instruction
dtype: string
- name: telugu_transliterated_input
dtype: string
- name: telugu_transliterated_output
dtype: string
- name: urdu_instruction
dtype: string
- name: urdu_input
dtype: string
splits:
- name: train
num_bytes: 148976621
num_examples: 28910
download_size: 67681432
dataset_size: 148976621
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
dhuynh95/Magicoder-Evol-Instruct-110K-Filtered_0.35-text | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 10765290
num_examples: 5000
download_size: 5317750
dataset_size: 10765290
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AlekseyKorshuk/intothatgoodnight-guanaco-chatml | ---
dataset_info:
features:
- name: conversation
list:
- name: content
dtype: string
- name: do_train
dtype: bool
- name: role
dtype: string
splits:
- name: train
num_bytes: 69125408
num_examples: 50000
download_size: 38037513
dataset_size: 69125408
---
# Dataset Card for "intothatgoodnight-guanaco-chatml"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/squad_no_rare_v4_train_30_eval_10_recite_ans_sent | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 581992
num_examples: 368
- name: validation
num_bytes: 48145
num_examples: 50
download_size: 112955
dataset_size: 630137
---
# Dataset Card for "squad_no_rare_v4_train_30_eval_10_recite_ans_sent"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
OpenPipe/hacker-news | ---
dataset_info:
features:
- name: id
dtype: int64
- name: type
dtype: string
- name: by
dtype: string
- name: time
dtype: timestamp[us]
- name: title
dtype: string
- name: text
dtype: string
- name: url
dtype: string
- name: score
dtype: float64
- name: parent
dtype: float64
- name: top_level_parent
dtype: int64
- name: descendants
dtype: float64
- name: kids
sequence: int64
- name: deleted
dtype: bool
- name: dead
dtype: bool
splits:
- name: train
num_bytes: 16886975696
num_examples: 38109500
download_size: 9948795138
dataset_size: 16886975696
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Hacker News posts and comments
This is a dataset of all HN posts and comments, current as of November 1, 2023. |
hamza50/testimg | ---
license: wtfpl
---
|
bagusatmaja/securityobjectives | ---
language:
- en
--- |
zz545906747/blue_eye_dragon | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_shaohang__Sparse0.5_OPT-1.3 | ---
pretty_name: Evaluation run of shaohang/Sparse0.5_OPT-1.3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [shaohang/Sparse0.5_OPT-1.3](https://huggingface.co/shaohang/Sparse0.5_OPT-1.3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_shaohang__Sparse0.5_OPT-1.3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-16T21:48:19.303713](https://huggingface.co/datasets/open-llm-leaderboard/details_shaohang__Sparse0.5_OPT-1.3/blob/main/results_2023-09-16T21-48-19.303713.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.003145973154362416,\n\
\ \"em_stderr\": 0.0005734993648436398,\n \"f1\": 0.047173867449664536,\n\
\ \"f1_stderr\": 0.0012666649528854216,\n \"acc\": 0.29319675461487227,\n\
\ \"acc_stderr\": 0.007301498172995543\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.003145973154362416,\n \"em_stderr\": 0.0005734993648436398,\n\
\ \"f1\": 0.047173867449664536,\n \"f1_stderr\": 0.0012666649528854216\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.000758150113722517,\n \
\ \"acc_stderr\": 0.0007581501137225237\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.585635359116022,\n \"acc_stderr\": 0.013844846232268563\n\
\ }\n}\n```"
repo_url: https://huggingface.co/shaohang/Sparse0.5_OPT-1.3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_09_16T21_48_19.303713
path:
- '**/details_harness|drop|3_2023-09-16T21-48-19.303713.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-16T21-48-19.303713.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_16T21_48_19.303713
path:
- '**/details_harness|gsm8k|5_2023-09-16T21-48-19.303713.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-16T21-48-19.303713.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_16T21_48_19.303713
path:
- '**/details_harness|winogrande|5_2023-09-16T21-48-19.303713.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-16T21-48-19.303713.parquet'
- config_name: results
data_files:
- split: 2023_09_16T21_48_19.303713
path:
- results_2023-09-16T21-48-19.303713.parquet
- split: latest
path:
- results_2023-09-16T21-48-19.303713.parquet
---
# Dataset Card for Evaluation run of shaohang/Sparse0.5_OPT-1.3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/shaohang/Sparse0.5_OPT-1.3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [shaohang/Sparse0.5_OPT-1.3](https://huggingface.co/shaohang/Sparse0.5_OPT-1.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_shaohang__Sparse0.5_OPT-1.3",
"harness_winogrande_5",
split="train")
```
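The configuration names passed to `load_dataset` follow a simple convention relative to the harness task names visible in the parquet paths: `|`, `:`, and `-` are all replaced with `_` (a sketch of the observed naming in this repo, not a documented API):

```python
def task_to_config(task_name: str) -> str:
    # Observed mapping, e.g. "harness|winogrande|5" -> config "harness_winogrande_5"
    for ch in "|:-":
        task_name = task_name.replace(ch, "_")
    return task_name

print(task_to_config("harness|drop|3"))
```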
## Latest results
These are the [latest results from run 2023-09-16T21:48:19.303713](https://huggingface.co/datasets/open-llm-leaderboard/details_shaohang__Sparse0.5_OPT-1.3/blob/main/results_2023-09-16T21-48-19.303713.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.003145973154362416,
"em_stderr": 0.0005734993648436398,
"f1": 0.047173867449664536,
"f1_stderr": 0.0012666649528854216,
"acc": 0.29319675461487227,
"acc_stderr": 0.007301498172995543
},
"harness|drop|3": {
"em": 0.003145973154362416,
"em_stderr": 0.0005734993648436398,
"f1": 0.047173867449664536,
"f1_stderr": 0.0012666649528854216
},
"harness|gsm8k|5": {
"acc": 0.000758150113722517,
"acc_stderr": 0.0007581501137225237
},
"harness|winogrande|5": {
"acc": 0.585635359116022,
"acc_stderr": 0.013844846232268563
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
aashay96/translated-dataset-synthetic-retrieval-tasks | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_mncai__mistral-7b-dpo-merge-v1.1 | ---
pretty_name: Evaluation run of mncai/mistral-7b-dpo-merge-v1.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mncai/mistral-7b-dpo-merge-v1.1](https://huggingface.co/mncai/mistral-7b-dpo-merge-v1.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mncai__mistral-7b-dpo-merge-v1.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-18T08:24:07.649575](https://huggingface.co/datasets/open-llm-leaderboard/details_mncai__mistral-7b-dpo-merge-v1.1/blob/main/results_2023-12-18T08-24-07.649575.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.653910016295201,\n\
\ \"acc_stderr\": 0.03202751255847507,\n \"acc_norm\": 0.6535699044300431,\n\
\ \"acc_norm_stderr\": 0.03269009606304505,\n \"mc1\": 0.5520195838433293,\n\
\ \"mc1_stderr\": 0.017408513063422917,\n \"mc2\": 0.6847598329649042,\n\
\ \"mc2_stderr\": 0.015228501222479747\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6945392491467577,\n \"acc_stderr\": 0.013460080478002503,\n\
\ \"acc_norm\": 0.7252559726962458,\n \"acc_norm_stderr\": 0.013044617212771227\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7080262895837482,\n\
\ \"acc_stderr\": 0.004537410615572944,\n \"acc_norm\": 0.8814977096195977,\n\
\ \"acc_norm_stderr\": 0.0032254141192897138\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724053,\n\
\ \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724053\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42328042328042326,\n \"acc_stderr\": 0.025446365634406783,\n \"\
acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.025446365634406783\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563973,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563973\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566545,\n\
\ \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566545\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.034076320938540516,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.034076320938540516\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467617,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467617\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8244274809160306,\n \"acc_stderr\": 0.03336820338476074,\n\
\ \"acc_norm\": 0.8244274809160306,\n \"acc_norm_stderr\": 0.03336820338476074\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\
\ \"acc_stderr\": 0.013507943909371802,\n \"acc_norm\": 0.8275862068965517,\n\
\ \"acc_norm_stderr\": 0.013507943909371802\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069356,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069356\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4770949720670391,\n\
\ \"acc_stderr\": 0.016704945740326188,\n \"acc_norm\": 0.4770949720670391,\n\
\ \"acc_norm_stderr\": 0.016704945740326188\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137894,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137894\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.02977945095730307,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.02977945095730307\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4706649282920469,\n\
\ \"acc_stderr\": 0.012748238397365549,\n \"acc_norm\": 0.4706649282920469,\n\
\ \"acc_norm_stderr\": 0.012748238397365549\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n\
\ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.02826388994378459,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.02826388994378459\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5520195838433293,\n\
\ \"mc1_stderr\": 0.017408513063422917,\n \"mc2\": 0.6847598329649042,\n\
\ \"mc2_stderr\": 0.015228501222479747\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8232044198895028,\n \"acc_stderr\": 0.010721923287918753\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7088703563305534,\n \
\ \"acc_stderr\": 0.012513215297888463\n }\n}\n```"
repo_url: https://huggingface.co/mncai/mistral-7b-dpo-merge-v1.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|arc:challenge|25_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|gsm8k|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hellaswag|10_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T08-24-07.649575.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-18T08-24-07.649575.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- '**/details_harness|winogrande|5_2023-12-18T08-24-07.649575.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-18T08-24-07.649575.parquet'
- config_name: results
data_files:
- split: 2023_12_18T08_24_07.649575
path:
- results_2023-12-18T08-24-07.649575.parquet
- split: latest
path:
- results_2023-12-18T08-24-07.649575.parquet
---
# Dataset Card for Evaluation run of mncai/mistral-7b-dpo-merge-v1.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mncai/mistral-7b-dpo-merge-v1.1](https://huggingface.co/mncai/mistral-7b-dpo-merge-v1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mncai__mistral-7b-dpo-merge-v1.1",
"harness_winogrande_5",
	split="latest")
```
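Since each run's split is named with an ISO-like timestamp, the newest run can also be picked programmatically: such timestamps sort chronologically, so a lexicographic `max` suffices. A minimal sketch (the second timestamp below is made up for illustration):

```python
# Split names as they appear in repos like this one: one timestamp per run.
# The second entry is a hypothetical earlier run, added only for illustration.
split_names = [
    "2023_12_18T08_24_07.649575",
    "2023_11_02T14_01_33.120000",
]

# ISO-like timestamps sort chronologically, so the lexicographic max is the newest run.
newest = max(split_names)
print(newest)
```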
## Latest results
These are the [latest results from run 2023-12-18T08:24:07.649575](https://huggingface.co/datasets/open-llm-leaderboard/details_mncai__mistral-7b-dpo-merge-v1.1/blob/main/results_2023-12-18T08-24-07.649575.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task's results in its "latest" split):
```python
{
"all": {
"acc": 0.653910016295201,
"acc_stderr": 0.03202751255847507,
"acc_norm": 0.6535699044300431,
"acc_norm_stderr": 0.03269009606304505,
"mc1": 0.5520195838433293,
"mc1_stderr": 0.017408513063422917,
"mc2": 0.6847598329649042,
"mc2_stderr": 0.015228501222479747
},
"harness|arc:challenge|25": {
"acc": 0.6945392491467577,
"acc_stderr": 0.013460080478002503,
"acc_norm": 0.7252559726962458,
"acc_norm_stderr": 0.013044617212771227
},
"harness|hellaswag|10": {
"acc": 0.7080262895837482,
"acc_stderr": 0.004537410615572944,
"acc_norm": 0.8814977096195977,
"acc_norm_stderr": 0.0032254141192897138
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724053,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724053
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.025446365634406783,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.025446365634406783
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563973,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563973
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566545,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566545
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.034076320938540516,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.034076320938540516
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.02675082699467617,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.02675082699467617
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8244274809160306,
"acc_stderr": 0.03336820338476074,
"acc_norm": 0.8244274809160306,
"acc_norm_stderr": 0.03336820338476074
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371802,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371802
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069356,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069356
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4770949720670391,
"acc_stderr": 0.016704945740326188,
"acc_norm": 0.4770949720670391,
"acc_norm_stderr": 0.016704945740326188
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137894,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137894
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.02977945095730307,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.02977945095730307
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4706649282920469,
"acc_stderr": 0.012748238397365549,
"acc_norm": 0.4706649282920469,
"acc_norm_stderr": 0.012748238397365549
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.02806499816704009,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.02806499816704009
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724553,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.02826388994378459,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.02826388994378459
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5520195838433293,
"mc1_stderr": 0.017408513063422917,
"mc2": 0.6847598329649042,
"mc2_stderr": 0.015228501222479747
},
"harness|winogrande|5": {
"acc": 0.8232044198895028,
"acc_stderr": 0.010721923287918753
},
"harness|gsm8k|5": {
"acc": 0.7088703563305534,
"acc_stderr": 0.012513215297888463
}
}
```
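As a sketch of how the per-task entries above roll up into the `"all"` block, here is an unweighted macro-average over a small subset of the MMLU accuracies shown (values copied from the JSON above; note the leaderboard's own aggregate covers all subtasks, so this is illustrative only):

```python
# Subset of the per-task accuracies from the results JSON above.
mmlu_acc = {
    "hendrycksTest-abstract_algebra": 0.35,
    "hendrycksTest-anatomy": 0.6666666666666666,
    "hendrycksTest-astronomy": 0.6973684210526315,
}

# Unweighted macro-average over the selected subtasks.
macro_avg = sum(mmlu_acc.values()) / len(mmlu_acc)
print(round(macro_avg, 4))
```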
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
maxidl/MathInstruct-de | ---
dataset_info:
features:
- name: source
dtype: string
- name: messages_en
list:
- name: content
dtype: string
- name: role
dtype: string
- name: messages_de
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 87671574
num_examples: 56793
download_size: 43803287
dataset_size: 87671574
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
German version of [TIGER-Lab/MathInstruct](https://huggingface.co/datasets/TIGER-Lab/MathInstruct). Translated using DeepL (informal style).
Currently limited to the first 56_793 examples.
|lang|#chars|
|---|---|
|en|38_793_154|
|de|42_851_569|
|
kannanwisen/Indian-Traffic-Sign-Classification | ---
license: cc-by-4.0
---
|
ghomasHudson/muld_NarrativeQA | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
sequence: string
splits:
- name: test
num_bytes: 3435452065
num_examples: 10143
- name: train
num_bytes: 11253796383
num_examples: 32747
- name: validation
num_bytes: 1176625993
num_examples: 3373
download_size: 8819172017
dataset_size: 15865874441
---
# Dataset Card for "muld_NarrativeQA"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
alirzb/WrongPreds | ---
dataset_info:
features:
- name: array
sequence: float64
- name: label_true
dtype: int64
- name: label_pred
dtype: int64
- name: id
dtype: string
- name: ws
dtype: image
splits:
- name: train
num_bytes: 4330982.0
num_examples: 9
download_size: 1745905
dataset_size: 4330982.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nielsr/datacomp_small_with_language | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: uid
dtype: string
- name: url
dtype: string
- name: text
dtype: string
- name: original_width
dtype: int64
- name: original_height
dtype: int64
- name: clip_b32_similarity_score
dtype: float32
- name: clip_l14_similarity_score
dtype: float32
- name: face_bboxes
sequence:
sequence: float64
- name: sha256
dtype: string
- name: detected_language
dtype: string
splits:
- name: train
num_bytes: 4108589578
num_examples: 12800000
download_size: 3339674305
dataset_size: 4108589578
---
# Dataset Card for "datacomp_small_with_language"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pixparse/docvqa-wds | ---
license: mit
---
# Dataset Card for DocVQA Dataset
## Dataset Description
- **Point of Contact from curators:** [Minesh Mathew](mailto:minesh.mathew@gmail.com), [Dimosthenis Karatzas](mailto:dimos@cvc.uab.es), [C. V. Jawahar](mailto:jawahar@iiit.ac.in)
- **Point of Contact Hugging Face:** [Pablo Montalvo](mailto:pablo@huggingface.co)
### Dataset Summary
The DocVQA dataset is a document visual question answering dataset introduced in Mathew et al. (2021), consisting of 50,000 questions defined over 12,000+ document images.
This is a webdataset (.tar shard) version equivalent to https://huggingface.co/datasets/pixparse/docvqa-single-page-questions in content.
Please visit the challenge page (https://rrc.cvc.uab.es/?ch=17) and paper (https://arxiv.org/abs/2007.00398) for further information.
### Usage
This dataset can be used with current releases of Hugging Face `datasets` or `chug` libraries.
Examples WIP
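While official examples are still WIP, the webdataset layout itself is easy to illustrate: each `.tar` shard groups the files of one sample under a shared key, with the file extension carrying the modality. The sketch below writes and reads back a toy shard using only the standard library (the sample key and field names are hypothetical, not this repo's actual keys):

```python
import io
import json
import tarfile

# Write a toy shard: one sample = all files sharing the key "sample000".
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w") as tar:
    qa = json.dumps({"question": "What is the date?", "answers": ["2021"]}).encode()
    info = tarfile.TarInfo("sample000.json")
    info.size = len(qa)
    tar.addfile(info, io.BytesIO(qa))

# Read it back, grouping members by key (the part before the first dot).
buf.seek(0)
samples = {}
with tarfile.open(fileobj=buf, mode="r") as tar:
    for member in tar.getmembers():
        key, ext = member.name.split(".", 1)
        samples.setdefault(key, {})[ext] = tar.extractfile(member).read()

record = json.loads(samples["sample000"]["json"])
print(record["question"])
```

In the real shards an image file (e.g. a `.jpg` member) would sit alongside the JSON under the same key; libraries like `chug` or `webdataset` perform this grouping and decoding for you.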
### Data Splits
#### Train
* 10,194 images, 39,463 questions and answers.
#### Validation
* 1,286 images, 5,349 questions and answers.
#### Test
* 1,287 images, 5,188 questions.
## Additional Information
### Dataset Curators
For original authors of the dataset, see citation below.
Hugging Face points of contact for this instance: Pablo Montalvo, Ross Wightman
### Licensing Information
MIT
### Citation Information
```bibtex
@InProceedings{docvqa_wacv,
author = {Mathew, Minesh and Karatzas, Dimosthenis and Jawahar, C.V.},
title = {DocVQA: A Dataset for VQA on Document Images},
booktitle = {WACV},
year = {2021},
pages = {2200-2209}
}
``` |
sankovic/jotapedataset | ---
license: openrail
---
|
AlekseyKorshuk/product-photography-v1-categorized | ---
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: category
dtype: string
- name: product
dtype: string
- name: prompt
dtype: string
- name: image_mask
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 6054931158.644
num_examples: 3606
- name: validation
num_bytes: 289483345.0
num_examples: 176
download_size: 6341502964
dataset_size: 6344414503.644
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
open-llm-leaderboard/details_allknowingroger__Synthiallamaguanco-7B-slerp | ---
pretty_name: Evaluation run of allknowingroger/Synthiallamaguanco-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [allknowingroger/Synthiallamaguanco-7B-slerp](https://huggingface.co/allknowingroger/Synthiallamaguanco-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_allknowingroger__Synthiallamaguanco-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-10T20:19:16.142131](https://huggingface.co/datasets/open-llm-leaderboard/details_allknowingroger__Synthiallamaguanco-7B-slerp/blob/main/results_2024-04-10T20-19-16.142131.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5022519883105377,\n\
\ \"acc_stderr\": 0.03435165246376145,\n \"acc_norm\": 0.5073979301587215,\n\
\ \"acc_norm_stderr\": 0.03511097439217402,\n \"mc1\": 0.29865361077111385,\n\
\ \"mc1_stderr\": 0.016021570613768542,\n \"mc2\": 0.455854048100616,\n\
\ \"mc2_stderr\": 0.01483939681310818\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5102389078498294,\n \"acc_stderr\": 0.014608326906285012,\n\
\ \"acc_norm\": 0.5435153583617748,\n \"acc_norm_stderr\": 0.01455594976049644\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5936068512248556,\n\
\ \"acc_stderr\": 0.004901558132335522,\n \"acc_norm\": 0.7909778928500298,\n\
\ \"acc_norm_stderr\": 0.00405779217189357\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.48026315789473684,\n \"acc_stderr\": 0.040657710025626036,\n\
\ \"acc_norm\": 0.48026315789473684,\n \"acc_norm_stderr\": 0.040657710025626036\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5433962264150943,\n \"acc_stderr\": 0.03065674869673943,\n\
\ \"acc_norm\": 0.5433962264150943,\n \"acc_norm_stderr\": 0.03065674869673943\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5138888888888888,\n\
\ \"acc_stderr\": 0.041795966175810016,\n \"acc_norm\": 0.5138888888888888,\n\
\ \"acc_norm_stderr\": 0.041795966175810016\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.41040462427745666,\n\
\ \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.41040462427745666,\n\
\ \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4595744680851064,\n \"acc_stderr\": 0.03257901482099835,\n\
\ \"acc_norm\": 0.4595744680851064,\n \"acc_norm_stderr\": 0.03257901482099835\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.04339138322579862,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.04339138322579862\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30158730158730157,\n \"acc_stderr\": 0.023636975996101806,\n \"\
acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.023636975996101806\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n\
\ \"acc_stderr\": 0.04104947269903394,\n \"acc_norm\": 0.30158730158730157,\n\
\ \"acc_norm_stderr\": 0.04104947269903394\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.535483870967742,\n\
\ \"acc_stderr\": 0.028372287797962935,\n \"acc_norm\": 0.535483870967742,\n\
\ \"acc_norm_stderr\": 0.028372287797962935\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3793103448275862,\n \"acc_stderr\": 0.03413963805906235,\n\
\ \"acc_norm\": 0.3793103448275862,\n \"acc_norm_stderr\": 0.03413963805906235\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6484848484848484,\n \"acc_stderr\": 0.037282069986826503,\n\
\ \"acc_norm\": 0.6484848484848484,\n \"acc_norm_stderr\": 0.037282069986826503\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6212121212121212,\n \"acc_stderr\": 0.03456088731993747,\n \"\
acc_norm\": 0.6212121212121212,\n \"acc_norm_stderr\": 0.03456088731993747\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7305699481865285,\n \"acc_stderr\": 0.03201867122877794,\n\
\ \"acc_norm\": 0.7305699481865285,\n \"acc_norm_stderr\": 0.03201867122877794\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.47435897435897434,\n \"acc_stderr\": 0.025317649726448652,\n\
\ \"acc_norm\": 0.47435897435897434,\n \"acc_norm_stderr\": 0.025317649726448652\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073845,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073845\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.0322529423239964,\n \
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.0322529423239964\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119995,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119995\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6844036697247706,\n \"acc_stderr\": 0.01992611751386967,\n \"\
acc_norm\": 0.6844036697247706,\n \"acc_norm_stderr\": 0.01992611751386967\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.03214952147802749,\n \"\
acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03214952147802749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6862745098039216,\n \"acc_stderr\": 0.03256685484460388,\n \"\
acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.03256685484460388\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7341772151898734,\n \"acc_stderr\": 0.028756799629658342,\n \
\ \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.028756799629658342\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n\
\ \"acc_stderr\": 0.03259625118416828,\n \"acc_norm\": 0.6188340807174888,\n\
\ \"acc_norm_stderr\": 0.03259625118416828\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n\
\ \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6033057851239669,\n \"acc_stderr\": 0.044658697805310094,\n \"\
acc_norm\": 0.6033057851239669,\n \"acc_norm_stderr\": 0.044658697805310094\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5648148148148148,\n\
\ \"acc_stderr\": 0.04792898170907061,\n \"acc_norm\": 0.5648148148148148,\n\
\ \"acc_norm_stderr\": 0.04792898170907061\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5460122699386503,\n \"acc_stderr\": 0.0391170190467718,\n\
\ \"acc_norm\": 0.5460122699386503,\n \"acc_norm_stderr\": 0.0391170190467718\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.0465614711001235,\n\
\ \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.0465614711001235\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7735042735042735,\n\
\ \"acc_stderr\": 0.027421007295392923,\n \"acc_norm\": 0.7735042735042735,\n\
\ \"acc_norm_stderr\": 0.027421007295392923\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6973180076628352,\n\
\ \"acc_stderr\": 0.016428781581749364,\n \"acc_norm\": 0.6973180076628352,\n\
\ \"acc_norm_stderr\": 0.016428781581749364\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5346820809248555,\n \"acc_stderr\": 0.026854257928258875,\n\
\ \"acc_norm\": 0.5346820809248555,\n \"acc_norm_stderr\": 0.026854257928258875\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n\
\ \"acc_stderr\": 0.014355911964767867,\n \"acc_norm\": 0.2435754189944134,\n\
\ \"acc_norm_stderr\": 0.014355911964767867\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.545751633986928,\n \"acc_stderr\": 0.02850980780262659,\n\
\ \"acc_norm\": 0.545751633986928,\n \"acc_norm_stderr\": 0.02850980780262659\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6109324758842444,\n\
\ \"acc_stderr\": 0.027690337536485376,\n \"acc_norm\": 0.6109324758842444,\n\
\ \"acc_norm_stderr\": 0.027690337536485376\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.558641975308642,\n \"acc_stderr\": 0.027628737155668777,\n\
\ \"acc_norm\": 0.558641975308642,\n \"acc_norm_stderr\": 0.027628737155668777\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3900709219858156,\n \"acc_stderr\": 0.02909767559946393,\n \
\ \"acc_norm\": 0.3900709219858156,\n \"acc_norm_stderr\": 0.02909767559946393\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3709256844850065,\n\
\ \"acc_stderr\": 0.012337391684530312,\n \"acc_norm\": 0.3709256844850065,\n\
\ \"acc_norm_stderr\": 0.012337391684530312\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5110294117647058,\n \"acc_stderr\": 0.030365446477275675,\n\
\ \"acc_norm\": 0.5110294117647058,\n \"acc_norm_stderr\": 0.030365446477275675\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.020227834851568375,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.020227834851568375\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.5545454545454546,\n \"acc_stderr\": 0.047605488214603246,\n\
\ \"acc_norm\": 0.5545454545454546,\n \"acc_norm_stderr\": 0.047605488214603246\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5673469387755102,\n\
\ \"acc_stderr\": 0.031717528240626645,\n \"acc_norm\": 0.5673469387755102,\n\
\ \"acc_norm_stderr\": 0.031717528240626645\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.6417910447761194,\n \"acc_stderr\": 0.03390393042268814,\n\
\ \"acc_norm\": 0.6417910447761194,\n \"acc_norm_stderr\": 0.03390393042268814\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.42168674698795183,\n \"acc_stderr\": 0.03844453181770917,\n\
\ \"acc_norm\": 0.42168674698795183,\n \"acc_norm_stderr\": 0.03844453181770917\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6900584795321637,\n\
\ \"acc_stderr\": 0.035469769593931624,\n \"acc_norm\": 0.6900584795321637,\n\
\ \"acc_norm_stderr\": 0.035469769593931624\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.29865361077111385,\n \"mc1_stderr\": 0.016021570613768542,\n\
\ \"mc2\": 0.455854048100616,\n \"mc2_stderr\": 0.01483939681310818\n\
\ },\n \"harness|winogrande|5\": {\n \"acc\": 0.744277821625888,\n\
\ \"acc_stderr\": 0.012261253845440473\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.18726307808946172,\n \"acc_stderr\": 0.010745914199510815\n\
\ }\n}\n```"
repo_url: https://huggingface.co/allknowingroger/Synthiallamaguanco-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|arc:challenge|25_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|gsm8k|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hellaswag|10_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T20-19-16.142131.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-10T20-19-16.142131.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- '**/details_harness|winogrande|5_2024-04-10T20-19-16.142131.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-10T20-19-16.142131.parquet'
- config_name: results
data_files:
- split: 2024_04_10T20_19_16.142131
path:
- results_2024-04-10T20-19-16.142131.parquet
- split: latest
path:
- results_2024-04-10T20-19-16.142131.parquet
---
# Dataset Card for Evaluation run of allknowingroger/Synthiallamaguanco-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [allknowingroger/Synthiallamaguanco-7B-slerp](https://huggingface.co/allknowingroger/Synthiallamaguanco-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_allknowingroger__Synthiallamaguanco-7B-slerp",
"harness_winogrande_5",
split="latest")
```
## Latest results
These are the [latest results from run 2024-04-10T20:19:16.142131](https://huggingface.co/datasets/open-llm-leaderboard/details_allknowingroger__Synthiallamaguanco-7B-slerp/blob/main/results_2024-04-10T20-19-16.142131.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5022519883105377,
"acc_stderr": 0.03435165246376145,
"acc_norm": 0.5073979301587215,
"acc_norm_stderr": 0.03511097439217402,
"mc1": 0.29865361077111385,
"mc1_stderr": 0.016021570613768542,
"mc2": 0.455854048100616,
"mc2_stderr": 0.01483939681310818
},
"harness|arc:challenge|25": {
"acc": 0.5102389078498294,
"acc_stderr": 0.014608326906285012,
"acc_norm": 0.5435153583617748,
"acc_norm_stderr": 0.01455594976049644
},
"harness|hellaswag|10": {
"acc": 0.5936068512248556,
"acc_stderr": 0.004901558132335522,
"acc_norm": 0.7909778928500298,
"acc_norm_stderr": 0.00405779217189357
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.48026315789473684,
"acc_stderr": 0.040657710025626036,
"acc_norm": 0.48026315789473684,
"acc_norm_stderr": 0.040657710025626036
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5433962264150943,
"acc_stderr": 0.03065674869673943,
"acc_norm": 0.5433962264150943,
"acc_norm_stderr": 0.03065674869673943
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.041795966175810016,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.041795966175810016
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.41040462427745666,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.41040462427745666,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4595744680851064,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.4595744680851064,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579862,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579862
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.023636975996101806,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.023636975996101806
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.04104947269903394,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.04104947269903394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.535483870967742,
"acc_stderr": 0.028372287797962935,
"acc_norm": 0.535483870967742,
"acc_norm_stderr": 0.028372287797962935
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3793103448275862,
"acc_stderr": 0.03413963805906235,
"acc_norm": 0.3793103448275862,
"acc_norm_stderr": 0.03413963805906235
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6484848484848484,
"acc_stderr": 0.037282069986826503,
"acc_norm": 0.6484848484848484,
"acc_norm_stderr": 0.037282069986826503
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6212121212121212,
"acc_stderr": 0.03456088731993747,
"acc_norm": 0.6212121212121212,
"acc_norm_stderr": 0.03456088731993747
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7305699481865285,
"acc_stderr": 0.03201867122877794,
"acc_norm": 0.7305699481865285,
"acc_norm_stderr": 0.03201867122877794
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.47435897435897434,
"acc_stderr": 0.025317649726448652,
"acc_norm": 0.47435897435897434,
"acc_norm_stderr": 0.025317649726448652
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073845,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073845
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.0322529423239964,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.0322529423239964
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119995,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119995
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6844036697247706,
"acc_stderr": 0.01992611751386967,
"acc_norm": 0.6844036697247706,
"acc_norm_stderr": 0.01992611751386967
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.03214952147802749,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.03214952147802749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.03256685484460388,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.03256685484460388
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.028756799629658342,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.028756799629658342
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.03259625118416828,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.03259625118416828
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5954198473282443,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.5954198473282443,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6033057851239669,
"acc_stderr": 0.044658697805310094,
"acc_norm": 0.6033057851239669,
"acc_norm_stderr": 0.044658697805310094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.04792898170907061,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.04792898170907061
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5460122699386503,
"acc_stderr": 0.0391170190467718,
"acc_norm": 0.5460122699386503,
"acc_norm_stderr": 0.0391170190467718
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.0465614711001235,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.0465614711001235
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7735042735042735,
"acc_stderr": 0.027421007295392923,
"acc_norm": 0.7735042735042735,
"acc_norm_stderr": 0.027421007295392923
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6973180076628352,
"acc_stderr": 0.016428781581749364,
"acc_norm": 0.6973180076628352,
"acc_norm_stderr": 0.016428781581749364
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5346820809248555,
"acc_stderr": 0.026854257928258875,
"acc_norm": 0.5346820809248555,
"acc_norm_stderr": 0.026854257928258875
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.014355911964767867,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.014355911964767867
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.545751633986928,
"acc_stderr": 0.02850980780262659,
"acc_norm": 0.545751633986928,
"acc_norm_stderr": 0.02850980780262659
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6109324758842444,
"acc_stderr": 0.027690337536485376,
"acc_norm": 0.6109324758842444,
"acc_norm_stderr": 0.027690337536485376
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.558641975308642,
"acc_stderr": 0.027628737155668777,
"acc_norm": 0.558641975308642,
"acc_norm_stderr": 0.027628737155668777
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3900709219858156,
"acc_stderr": 0.02909767559946393,
"acc_norm": 0.3900709219858156,
"acc_norm_stderr": 0.02909767559946393
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3709256844850065,
"acc_stderr": 0.012337391684530312,
"acc_norm": 0.3709256844850065,
"acc_norm_stderr": 0.012337391684530312
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5110294117647058,
"acc_stderr": 0.030365446477275675,
"acc_norm": 0.5110294117647058,
"acc_norm_stderr": 0.030365446477275675
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5,
"acc_stderr": 0.020227834851568375,
"acc_norm": 0.5,
"acc_norm_stderr": 0.020227834851568375
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5545454545454546,
"acc_stderr": 0.047605488214603246,
"acc_norm": 0.5545454545454546,
"acc_norm_stderr": 0.047605488214603246
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5673469387755102,
"acc_stderr": 0.031717528240626645,
"acc_norm": 0.5673469387755102,
"acc_norm_stderr": 0.031717528240626645
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6417910447761194,
"acc_stderr": 0.03390393042268814,
"acc_norm": 0.6417910447761194,
"acc_norm_stderr": 0.03390393042268814
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6900584795321637,
"acc_stderr": 0.035469769593931624,
"acc_norm": 0.6900584795321637,
"acc_norm_stderr": 0.035469769593931624
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29865361077111385,
"mc1_stderr": 0.016021570613768542,
"mc2": 0.455854048100616,
"mc2_stderr": 0.01483939681310818
},
"harness|winogrande|5": {
"acc": 0.744277821625888,
"acc_stderr": 0.012261253845440473
},
"harness|gsm8k|5": {
"acc": 0.18726307808946172,
"acc_stderr": 0.010745914199510815
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
tfnn/MESHY.AI_363_PLY_Creatures_Labelled | ---
license: cc0-1.0
language:
- en
tags:
- glb
- 3d
- asset
- model
- mesh
- gltf
- meshy
- meshy.ai
- creatures
- humans
- people
- animals
- monsters
- robots
- mythical
- dragons
pretty_name: MESHY.AI 363 PLY Creatures Labelled
size_categories:
- n<1K
---
- [meshy_creatures.zip](https://huggingface.co/datasets/tfnn/MESHY.AI_363_PLY_Creatures_Labelled/resolve/main/meshy_creatures.zip?download=true) - 363 Samples [PLY](https://paulbourke.net/dataformats/ply/) ([Vertex Colored](https://github.com/VertexColor), Labelled).
Refined from [this dataset](https://huggingface.co/datasets/tfnn/MESHY.AI_800_GLB_3D-Assets_Categorised_and_Labelled), with spelling mistakes fixed and slightly improved labelling, this dataset of 363 creatures includes `people`, `animals`, `monsters`, and `robots`, along with some mythical dragons, mushroom people, frogs, four-legged animals with two legs, etc.
These models have been scaled to fit a unit sphere: each model is normalised to a unit cube and then multiplied by 0.55, which guarantees it fits entirely within the unit sphere.
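The exact normalisation procedure isn't published with the dataset; a minimal stdlib sketch of one plausible reading of "normalised cubic scale multiplied by 0.55" (the function name and cube-then-0.55 interpretation are assumptions, not part of the dataset):

```python
import math

def normalize_to_unit_sphere(vertices, factor=0.55):
    """Center vertices, normalise to the cube [-1, 1]^3 by the largest
    half-extent, then scale by `factor` so the mesh fits in the unit sphere."""
    mins = [min(v[i] for v in vertices) for i in range(3)]
    maxs = [max(v[i] for v in vertices) for i in range(3)]
    center = [(mins[i] + maxs[i]) / 2.0 for i in range(3)]
    half_extent = max(abs(v[i] - center[i]) for v in vertices for i in range(3))
    return [[(v[i] - center[i]) / half_extent * factor for i in range(3)]
            for v in vertices]

# Worst case: the 8 corners of a cube
cube = [[x, y, z] for x in (0, 2) for y in (0, 2) for z in (0, 2)]
scaled = normalize_to_unit_sphere(cube)
max_radius = max(math.sqrt(sum(c * c for c in v)) for v in scaled)
print(max_radius)  # 0.55 * sqrt(3) ≈ 0.9526, safely inside the unit sphere
```

The 0.55 factor works because a cube corner normalised to half-extent 1 lies at distance √3 from the origin, and 0.55·√3 ≈ 0.953 < 1.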
This dataset makes for good test cases where you just want to quickly prove that a network topology works on a reasonably decent and small dataset. All models in this dataset are roughly the same category, with the biggest deviation being two-legged versus four-legged creatures, or bipedal robots with wheels rather than legs.
Each model ranges between 16,000 and 45,000 vertices, with fairly uniform spacing between vertices. Models are vertex colored.
Special thanks to [MESHY.AI](https://meshy.ai) for generating the models for this dataset using their online service. |
mapama247/chess_games_caissabase | ---
license: apache-2.0
---
|
yzhuang/autotree_pmlb_100000_clean2_sgosdt_l256_dim10_d3_sd0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: input_y_clean
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 2364400000
num_examples: 100000
- name: validation
num_bytes: 236440000
num_examples: 10000
download_size: 622819971
dataset_size: 2600840000
---
# Dataset Card for "autotree_pmlb_100000_clean2_sgosdt_l256_dim10_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Ziyuan111/Urban_Tree_Canopy_in_Durham2 | ---
license: apache-2.0
language:
- en
size_categories:
- 10K<n<100K
---
# Durham Urban Canopy Analysis and Enhancement Initiative (DUCAEI)
## Project Overview
The Durham Urban Canopy Analysis and Enhancement Initiative (DUCAEI) is committed to utilizing the Trees & Planting Sites dataset for a comprehensive geospatial analysis of Durham's urban tree canopy. Using Python within Google Colab, our aim is to identify key locations for canopy expansion, evaluate the impact of urban development on green spaces, and deliver informed recommendations for the sustainable growth of urban tree coverage.
## Background and Rationale
Durham's urban tree canopy is a crucial component that contributes to environmental quality, public health, and overall city aesthetics. This canopy is under threat due to ongoing urban development and natural wear. A systematic, data-driven approach is critical for strategic planning and conservation of the urban forest to ensure its vitality for generations to come.
## Data Sources and Methodology
### Data Sources
We will leverage the following files from the Durham Trees & Planting Sites Dataset, as found on the Durham Open Data portal:
- `GS_TreeInventory.shp`
- `Trees_&_Planting_Sites.csv`
- `Trees_%26_Planting_Sites.geojson`
# Dataset Card for Urban Tree Inventory
## Dataset Description
This dataset provides comprehensive information about urban trees within a specified area, including their physical characteristics, environmental benefits, and the economic value they add in terms of ecosystem services.
### Spatial Data (GeoJSON)
**Format:** GeoJSON
**Content:**
- **Type:** `FeatureCollection` - A collection of feature objects.
- **Features:** Each feature object represents a tree and contains:
- **Type:** `Feature`
- **Geometry:** `Point` (includes longitude and latitude of the tree location).
- **Properties:** Detailed information about the tree (some fields may overlap with the CSV structure below).
### Tabular Data (CSV)
**Format:** CSV
**Columns:**
- **X, Y:** Coordinates of the tree location.
- **OBJECTID:** Unique identifier for the tree.
- **streetaddress:** Street address nearest to the tree.
- **city:** City where the tree is located.
- **zipcode:** Zip code for the location of the tree.
- **facilityid:** Identifier for the facility associated with the tree, if any.
- **present:** Indication of whether the tree is currently present.
- **genus, species, commonname:** Botanical and common names of the tree.
- **plantingdate:** Date when the tree was planted.
- **diameterin:** Diameter of the tree trunk in inches.
- **heightft:** Height of the tree in feet.
- **condition:** Health condition of the tree.
- **contractwork:** Indicates if the tree has had any contract work done.
- **neighborhood:** Neighborhood where the tree is located.
- **program:** The program under which the tree was planted.
- **plantingw:** Width of the planting site.
- **plantingcond:** Condition of the planting site.
- **underpwerlins:** Whether the tree is under power lines.
- **matureheight:** The mature height of the tree.
- **GlobalID:** A global unique identifier for the tree.
- **created_user:** The user who created the record.
- **created_date:** The date the record was created.
- **last_edited_user:** The user who last edited the record.
- **last_edited_date:** The date the record was last edited.
#### Environmental and Economic Data:
- **isoprene, monoterpene, vocs:** Emissions and absorption data for various compounds.
- **coremoved_ozperyr, o3removed_ozperyr, etc.:** Annual pollutant removal metrics.
- **o2production_lbperyr:** Annual oxygen production.
- **carbonstorage_lb, carbonstorage_dol:** Carbon storage metrics.
- **grosscarseq_lbperyr, grosscarseq_dolperyr:** Gross carbon sequestration.
- **avoidrunoff_ft2peryr, avoidrunoff_dol2peryr:** Metrics related to stormwater runoff avoidance.
- **totannbenefits_dolperyr:** Total annual dollar benefits from the tree.
- **leafarea_sqft, potevapotran_cuftperyr, etc.:** Metrics related to the water cycle.
- **heating_mbtuperyr, cooling_kwhperyr, etc.:** Energy savings related to the tree's impact on building energy use.
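As a minimal illustration of how the economic columns can be used, the sketch below aggregates `totannbenefits_dolperyr` by neighborhood with only the standard library (the sample rows and dollar values are hypothetical; the real pipeline would read the full CSV):

```python
from collections import defaultdict

def benefits_by_neighborhood(records):
    """Sum totannbenefits_dolperyr per neighborhood.
    records: iterable of dicts keyed by the CSV column names above."""
    totals = defaultdict(float)
    for r in records:
        totals[r["neighborhood"]] += float(r["totannbenefits_dolperyr"])
    return dict(totals)

trees = [  # hypothetical rows for illustration
    {"neighborhood": "Old North Durham", "totannbenefits_dolperyr": "41.50"},
    {"neighborhood": "Old North Durham", "totannbenefits_dolperyr": "12.25"},
    {"neighborhood": "Trinity Park", "totannbenefits_dolperyr": "33.00"},
]
print(benefits_by_neighborhood(trees))
# {'Old North Durham': 53.75, 'Trinity Park': 33.0}
```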
### Example Record
**GeoJSON Feature:**
```json
{
"type": "Feature",
"geometry": {
"type": "Point",
"coordinates": [-78.90863, 36.00441]
},
"properties": {
"OBJECTID": 2840940,
"commonname": "Willow Oak",
// Additional properties...
}
}
```
The `GS_TreeInventory.shp` file encompasses a range of attributes for each record:
- **OBJECTID:** Unique identifier for each record.
- **streetaddr:** Street address where the tree or planting site is located.
- **city:** The city name, which is Durham.
- **zipcode:** Postal code for the location.
- **facilityid:** Identifier possibly linked to a facility or area associated with the tree.
- **present:** Type of feature present, such as a tree or a planting site.
- **genus:** Genus of the tree.
- **species:** Species of the tree.
- **commonname:** Common name of the tree.
- **plantingda:** Date or year range when the tree was planted or the planting site was established.
- ...
### Objectives
1. Combine Shapefile and CSV data into a comprehensive geospatial dataset using Python.
2. Apply Python libraries to uncover relationships between tree canopy data and urban development.
3. Provide practical insights and strategies for the expansion of Durham's urban tree canopy.
4. Produce analyses and visualizations with the GeoJSON file.
### Methodology
Our analytical process within Google Colab will encompass:
- **Data Preparation and Integration:** Using tools like Geopandas, Pandas, and PyShp to organize and combine spatial and tabular data.
- **Geospatial Analysis:** Applying Shapely and Rtree for spatial analysis, and using SciPy or Statsmodels for statistical correlations.
- **Visualization and Optimization:** Generating maps and graphs with Matplotlib, Seaborn, or Plotly, and utilizing optimization algorithms to suggest optimal planting locations.
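A minimal sketch of the data-integration step, using only the standard library and the `X`/`Y`/attribute columns described earlier (the real workflow would use Geopandas; the sample row here mirrors the example record):

```python
import csv
import io
import json

def rows_to_geojson(csv_text):
    """Convert tree-inventory CSV rows (with X/Y coordinates) into a
    GeoJSON FeatureCollection matching the structure shown above."""
    features = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        x, y = float(row.pop("X")), float(row.pop("Y"))  # lon, lat
        features.append({
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [x, y]},
            "properties": row,  # remaining columns become properties
        })
    return {"type": "FeatureCollection", "features": features}

sample = "X,Y,OBJECTID,commonname\n-78.90863,36.00441,2840940,Willow Oak\n"
collection = rows_to_geojson(sample)
print(json.dumps(collection["features"][0]["geometry"]))
# {"type": "Point", "coordinates": [-78.90863, 36.00441]}
```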
## Deliverables
1. A collection of Google Colab Python notebooks that outline our analytical processes.
2. Interactive maps and visualizations that connect tree canopy coverage with urban development metrics.
3. An exhaustive report that contains our findings and recommendations for enhancing the urban canopy.
## Limitations
- **Computational Resources:** Google Colab's limited computational resources may constrain the size of the datasets and the complexity of the models we can employ.
- **Data Quality:** The accuracy and currency of the data ultimately affect the precision of our recommendations.
- **Sociopolitical Considerations:** Implementation of our data-driven suggestions must be reviewed within the context of local policy and community input.
## Conclusion
DUCAEI aims to create a more verdant and livable urban landscape in Durham through this Python-based analytical project. By laying a strong foundation for data-informed decision-making, we hope to cultivate a thriving, green, and sustainable urban environment. |
ainzOulgun/fqavrR | ---
license: openrail
---
|
ctoraman/deprem-tweet-dataset | ---
license: cc
task_categories:
- text-classification
- token-classification
language:
- tr
tags:
- disaster-relief
- disaster
- earthquake
- tweets
- deprem
- tweet-classification
- ner
size_categories:
- 1K<n<10K
---
Tweets Under the Rubble: Detection of Messages Calling for Help in Earthquake Disaster
The annotated dataset is given in `dataset.tsv`. We annotated 1,000 Turkish tweets for whether they call for help (i.e., request rescue, supplies, or donation), along with their entity tags (person, city, address, status).
| Column Name | Description |
|:------------|:------------|
| label | Human annotation of whether the tweet calls for help (binary classification). |
| entities | Human annotation of entity tags (i.e. person, city, address, and status). The format is `[START_INDEX]:[END_INDEX]%[TAG_TYPE]`. |
| tweet_id | Tweet ID from the Twitter API. |
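The span-index format can be decoded with a few lines of Python. How multiple entity annotations are joined together is not specified here, so this sketch handles a single `START:END%TAG` spec; the example tweet is invented for illustration:

```python
def parse_entity(spec):
    """Parse one annotation of the form START:END%TAG,
    e.g. '18:23%city' -> (18, 23, 'city')."""
    span, tag = spec.split("%")
    start, end = span.split(":")
    return int(start), int(end), tag

def extract_entity(tweet, spec):
    """Return the annotated substring of the tweet and its tag type."""
    start, end, tag = parse_entity(spec)
    return tweet[start:end], tag

tweet = "Enkaz altindayim, Hatay Antakya"  # hypothetical example
print(extract_entity(tweet, "18:23%city"))  # ('Hatay', 'city')
```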
Other details can be found at https://github.com/avaapm/deprem
Citation
If you make use of this dataset, please cite the following paper.
@misc{toraman2023earthquake,
doi = {10.48550/ARXIV.2302.13403},
url = {https://arxiv.org/abs/2302.13403},
author = {Toraman, Cagri and Kucukkaya, Izzet Emre and Ozcelik, Oguzhan and Sahin, Umitcan},
keywords = {Social and Information Networks (cs.SI), Computation and Language (cs.CL), Information Retrieval (cs.IR), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Tweets Under the Rubble: Detection of Messages Calling for Help in Earthquake Disaster},
publisher = {arXiv},
year = {2023},
copyright = {Creative Commons Attribution Non Commercial Share Alike 4.0 International}
} |
mpasila/DarkViperAU-Essays | ---
language:
- en
---
This is just the raw text extracted from DarkViperAU's essays; it will be cleaned up later. |
allenai/multixscience_sparse_max | ---
annotations_creators:
- found
language_creators:
- found
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- summarization
paperswithcode_id: multi-xscience
pretty_name: Multi-XScience
---
This is a copy of the [Multi-XScience](https://huggingface.co/datasets/multi_x_science_sum) dataset, except that the input source documents of its `test` split have been replaced by documents retrieved with a __sparse__ retriever. The retrieval pipeline used:
- __query__: The `related_work` field of each example
- __corpus__: The union of all documents in the `train`, `validation` and `test` splits
- __retriever__: BM25 via [PyTerrier](https://pyterrier.readthedocs.io/en/latest/) with default settings
- __top-k strategy__: `"max"`, i.e. the number of documents retrieved, `k`, is set as the maximum number of documents seen across examples in this dataset, in this case `k==20`
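The actual pipeline uses BM25 via PyTerrier; as a rough, from-scratch illustration of the scoring involved (not the PyTerrier implementation, and with a toy corpus invented for the example), Okapi BM25 can be sketched as:

```python
import math
from collections import Counter

def bm25_scores(query, docs, k1=1.2, b=0.75):
    """Score each document against the query with Okapi BM25.
    docs: list of token lists; query: token list."""
    N = len(docs)
    avgdl = sum(len(d) for d in docs) / N
    df = Counter(t for d in docs for t in set(d))  # document frequencies
    scores = []
    for d in docs:
        tf = Counter(d)
        s = 0.0
        for t in query:
            if tf[t] == 0:
                continue
            idf = math.log(1 + (N - df[t] + 0.5) / (df[t] + 0.5))
            s += idf * tf[t] * (k1 + 1) / (tf[t] + k1 * (1 - b + b * len(d) / avgdl))
        scores.append(s)
    return scores

docs = [
    "graph neural networks for citation recommendation".split(),
    "summarization of related work sections".split(),
    "bm25 ranking for sparse retrieval".split(),
]
query = "sparse retrieval ranking".split()
scores = bm25_scores(query, docs)
print(max(range(len(docs)), key=lambda i: scores[i]))  # 2
```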
Retrieval results on the `train` set:
| Recall@100 | Rprec | Precision@k | Recall@k |
| ----------- | ----------- | ----------- | ----------- |
| 0.5482 | 0.2243 | 0.0547 | 0.4063 |
Retrieval results on the `validation` set:
| Recall@100 | Rprec | Precision@k | Recall@k |
| ----------- | ----------- | ----------- | ----------- |
| 0.5476 | 0.2209 | 0.0553 | 0.4026 |
Retrieval results on the `test` set:
| Recall@100 | Rprec | Precision@k | Recall@k |
| ----------- | ----------- | ----------- | ----------- |
| 0.5480 | 0.2272 | 0.055 | 0.4039 | |
liuyanchen1015/MULTI_VALUE_rte_plural_postposed | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 890889
num_examples: 2389
- name: train
num_bytes: 776257
num_examples: 1989
download_size: 1045708
dataset_size: 1667146
---
# Dataset Card for "MULTI_VALUE_rte_plural_postposed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/charlotte_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of charlotte (Fire Emblem)
This is the dataset of charlotte (Fire Emblem), containing 285 images and their tags.
The core tags of this character are `blonde_hair, long_hair, breasts, bow, blue_eyes, hair_bow, large_breasts, bangs, white_bow`, which are pruned in this dataset.
Images were crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 285 | 304.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/charlotte_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 285 | 183.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/charlotte_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 665 | 377.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/charlotte_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 285 | 275.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/charlotte_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 665 | 516.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/charlotte_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/charlotte_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 26 |  |  |  |  |  | 1girl, hetero, solo_focus, 1boy, penis, pussy, nipples, blush, uncensored, vaginal, nude, open_mouth, navel, sex_from_behind, spread_legs, testicles |
| 1 | 18 |  |  |  |  |  | 1boy, 1girl, hetero, solo_focus, penis, nipples, paizuri, blush, nude, cum_on_breasts, smile, facial, uncensored |
| 2 | 20 |  |  |  |  |  | 1girl, cleavage, solo, smile, navel, midriff, looking_at_viewer, shoulder_armor, bikini_armor, white_background, simple_background, spikes |
| 3 | 11 |  |  |  |  |  | 1girl, solo, huge_breasts, looking_at_viewer, smile, thick_thighs, blush, blunt_bangs, short_shorts, ass, cleavage, simple_background, thighhighs, blue_background, denim_shorts, from_behind, looking_back |
| 4 | 10 |  |  |  |  |  | 1girl, huge_penis, solo, uncensored, blush, erection, nipples, nude, open_mouth, testicles, navel, heart, animal_penis, blunt_bangs, ejaculation, futanari_masturbation, rolling_eyes, spread_legs, thighhighs, tongue, veiny_penis |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | hetero | solo_focus | 1boy | penis | pussy | nipples | blush | uncensored | vaginal | nude | open_mouth | navel | sex_from_behind | spread_legs | testicles | paizuri | cum_on_breasts | smile | facial | cleavage | solo | midriff | looking_at_viewer | shoulder_armor | bikini_armor | white_background | simple_background | spikes | huge_breasts | thick_thighs | blunt_bangs | short_shorts | ass | thighhighs | blue_background | denim_shorts | from_behind | looking_back | huge_penis | erection | heart | animal_penis | ejaculation | futanari_masturbation | rolling_eyes | tongue | veiny_penis |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:-------------|:-------|:--------|:--------|:----------|:--------|:-------------|:----------|:-------|:-------------|:--------|:------------------|:--------------|:------------|:----------|:-----------------|:--------|:---------|:-----------|:-------|:----------|:--------------------|:-----------------|:---------------|:-------------------|:--------------------|:---------|:---------------|:---------------|:--------------|:---------------|:------|:-------------|:------------------|:---------------|:--------------|:---------------|:-------------|:-----------|:--------|:---------------|:--------------|:------------------------|:---------------|:---------|:--------------|
| 0 | 26 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 18 |  |  |  |  |  | X | X | X | X | X | | X | X | X | | X | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 20 |  |  |  |  |  | X | | | | | | | | | | | | X | | | | | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 3 | 11 |  |  |  |  |  | X | | | | | | | X | | | | | | | | | | | X | | X | X | | X | | | | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 4 | 10 |  |  |  |  |  | X | | | | | | X | X | X | | X | X | X | | X | X | | | | | | X | | | | | | | | | | X | | | X | | | | | X | X | X | X | X | X | X | X | X |
|
CyberHarem/silk_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of silk (Fire Emblem)
This is the dataset of silk (Fire Emblem), containing 107 images and their tags.
The core tags of this character are `blue_hair, short_hair, breasts, blue_eyes, bangs`, which are pruned in this dataset.
Images were crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 107 | 92.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/silk_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 107 | 62.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/silk_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 236 | 118.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/silk_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 107 | 85.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/silk_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 236 | 149.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/silk_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/silk_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 64 |  |  |  |  |  | 1girl, solo, dress, simple_background, smile, looking_at_viewer, veil, long_sleeves, white_background |
| 1 | 11 |  |  |  |  |  | hetero, solo_focus, 1girl, blush, vaginal, large_breasts, nipples, open_mouth, mosaic_censoring, multiple_boys, completely_nude, cum_in_pussy, multiple_penises, spread_legs, 1boy, sweat, double_handjob, gangbang, green_eyes, pregnant, tongue |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | dress | simple_background | smile | looking_at_viewer | veil | long_sleeves | white_background | hetero | solo_focus | blush | vaginal | large_breasts | nipples | open_mouth | mosaic_censoring | multiple_boys | completely_nude | cum_in_pussy | multiple_penises | spread_legs | 1boy | sweat | double_handjob | gangbang | green_eyes | pregnant | tongue |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------------------|:--------|:--------------------|:-------|:---------------|:-------------------|:---------|:-------------|:--------|:----------|:----------------|:----------|:-------------|:-------------------|:----------------|:------------------|:---------------|:-------------------|:--------------|:-------|:--------|:-----------------|:-----------|:-------------|:-----------|:---------|
| 0 | 64 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
FelixdoingAI/IP2P-adwm-128-5000 | ---
dataset_info:
features:
- name: original_prompt
dtype: string
- name: original_image
dtype: image
- name: edit_prompt
dtype: string
- name: edited_prompt
dtype: string
- name: edited_image
dtype: image
- name: adversarial_image
dtype: image
splits:
- name: train
num_bytes: 574509255.0
num_examples: 5000
download_size: 574446939
dataset_size: 574509255.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/namba_eri_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of namba_eri (THE iDOLM@STER: Cinderella Girls)
This is the dataset of namba_eri (THE iDOLM@STER: Cinderella Girls), containing 32 images and their tags.
The core tags of this character are `brown_hair, short_hair, hair_ornament, hairclip, green_eyes, wavy_hair`, which are pruned in this dataset.
Images were crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 32 | 20.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/namba_eri_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 32 | 18.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/namba_eri_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 55 | 30.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/namba_eri_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 32 | 20.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/namba_eri_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 55 | 32.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/namba_eri_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/namba_eri_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, solo, card_(medium), character_name, sun_symbol, skirt, open_mouth, orange_background, :d, looking_at_viewer, bow, breasts, grin |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | card_(medium) | character_name | sun_symbol | skirt | open_mouth | orange_background | :d | looking_at_viewer | bow | breasts | grin |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:----------------|:-----------------|:-------------|:--------|:-------------|:--------------------|:-----|:--------------------|:------|:----------|:-------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
autoevaluate/autoeval-eval-xsum-default-8e4fa8-60494145409 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- xsum
eval_info:
task: summarization
model: morenolq/bart-base-xsum
metrics: []
dataset_name: xsum
dataset_config: default
dataset_split: test
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: morenolq/bart-base-xsum
* Dataset: xsum
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@zuzannad1](https://huggingface.co/zuzannad1) for evaluating this model. |
yzhuang/autotree_automl_heloc_gosdt_l512_d3_sd3 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: int64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: int64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 11682400000
num_examples: 100000
- name: validation
num_bytes: 1168240000
num_examples: 10000
download_size: 1508311822
dataset_size: 12850640000
---
# Dataset Card for "autotree_automl_heloc_gosdt_l512_d3_sd3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
davidreyblanco/distilabel-math-instructions | ---
dataset_info:
features:
- name: instructions
dtype: string
splits:
- name: train
num_bytes: 169849
num_examples: 1840
download_size: 56464
dataset_size: 169849
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-110000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 969978
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mHossain/final_train_v2_90000 | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: input_text
dtype: string
- name: target_text
dtype: string
- name: prefix
dtype: string
splits:
- name: train
num_bytes: 9136964.7
num_examples: 27000
- name: test
num_bytes: 1015218.3
num_examples: 3000
download_size: 4451000
dataset_size: 10152183.0
---
# Dataset Card for "final_train_v2_90000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
naorm/desktop-blip-large | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 12745883.0
num_examples: 51
download_size: 12428402
dataset_size: 12745883.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yiyic/eval_clir_mtg | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: mtg_en
num_bytes: 48454
num_examples: 500
- name: mtg_de
num_bytes: 57186
num_examples: 500
- name: mtg_es
num_bytes: 51263
num_examples: 500
- name: mtg_fr
num_bytes: 59140
num_examples: 500
- name: nq_en
num_bytes: 306325
num_examples: 500
- name: en_rt
num_bytes: 49483
num_examples: 500
- name: de_en_multi8_test1
num_bytes: 1308045
num_examples: 1000
- name: de_fr_multi8_test1
num_bytes: 1310347
num_examples: 1000
- name: de_es_multi8_test1
num_bytes: 1310100
num_examples: 1000
- name: en_de_multi8_test1
num_bytes: 1120561
num_examples: 1000
- name: en_fr_multi8_test1
num_bytes: 1122148
num_examples: 1000
- name: en_es_multi8_test1
num_bytes: 1121901
num_examples: 1000
- name: es_en_multi8_test1
num_bytes: 1093157
num_examples: 1000
- name: es_fr_multi8_test1
num_bytes: 1095459
num_examples: 1000
- name: es_de_multi8_test1
num_bytes: 1093872
num_examples: 1000
- name: fr_en_multi8_test1
num_bytes: 1134440
num_examples: 1000
- name: fr_de_multi8_test1
num_bytes: 1135155
num_examples: 1000
- name: fr_es_multi8_test1
num_bytes: 1136495
num_examples: 1000
download_size: 9037857
dataset_size: 14553531
configs:
- config_name: default
data_files:
- split: mtg_en
path: data/mtg_en-*
- split: mtg_de
path: data/mtg_de-*
- split: mtg_es
path: data/mtg_es-*
- split: mtg_fr
path: data/mtg_fr-*
- split: nq_en
path: data/nq_en-*
- split: en_rt
path: data/en_rt-*
- split: de_en_multi8_test1
path: data/de_en_multi8_test1-*
- split: de_fr_multi8_test1
path: data/de_fr_multi8_test1-*
- split: de_es_multi8_test1
path: data/de_es_multi8_test1-*
- split: en_de_multi8_test1
path: data/en_de_multi8_test1-*
- split: en_fr_multi8_test1
path: data/en_fr_multi8_test1-*
- split: en_es_multi8_test1
path: data/en_es_multi8_test1-*
- split: es_en_multi8_test1
path: data/es_en_multi8_test1-*
- split: es_fr_multi8_test1
path: data/es_fr_multi8_test1-*
- split: es_de_multi8_test1
path: data/es_de_multi8_test1-*
- split: fr_en_multi8_test1
path: data/fr_en_multi8_test1-*
- split: fr_de_multi8_test1
path: data/fr_de_multi8_test1-*
- split: fr_es_multi8_test1
path: data/fr_es_multi8_test1-*
---
|
YashaP/tinyscitldr | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
sequence: string
splits:
- name: train
num_bytes: 1368087
num_examples: 1000
download_size: 737186
dataset_size: 1368087
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
foilfoilfoil/LaminiChatML | ---
license: other
---
|
Rasu23/iapp_chosen_reject_orpov1 | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: question
dtype: string
- name: context
dtype: string
- name: index_column
dtype: int64
- name: text_prompt
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 33360071
num_examples: 5610
download_size: 5948625
dataset_size: 33360071
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
SEACrowd/posp | ---
tags:
- pos-tagging
language:
- ind
---
# posp
POSP is a POS tagging dataset containing 8,400 sentences collected from an Indonesian news website, with 26 POS tag classes.
The POS tag labels follow the Indonesian Association of Computational Linguistics (INACL) POS Tagging Convention.
The POSP dataset is split into 3 sets: 6,720 train, 840 validation, and 840 test examples.
## Dataset Usage
Run `pip install nusacrowd` before loading the dataset through HuggingFace's `load_dataset`.
## Citation
```
@inproceedings{hoesen2018investigating,
title={Investigating Bi-LSTM and CRF with POS Tag Embedding for Indonesian Named Entity Tagger},
author={Devin Hoesen and Ayu Purwarianti},
booktitle={Proceedings of the 2018 International Conference on Asian Language Processing (IALP)},
pages={35--38},
year={2018},
organization={IEEE}
}
@inproceedings{wilie2020indonlu,
title={IndoNLU: Benchmark and Resources for Evaluating Indonesian Natural Language Understanding},
author={Bryan Wilie and Karissa Vincentio and Genta Indra Winata and Samuel Cahyawijaya and X. Li and Zhi Yuan Lim and S. Soleman and R. Mahendra and Pascale Fung and Syafri Bahar and A. Purwarianti},
booktitle={Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing},
year={2020}
}
```
## License
Creative Commons Attribution Share-Alike 4.0 International
## Homepage
[https://github.com/IndoNLP/indonlu](https://github.com/IndoNLP/indonlu)
### NusaCatalogue
For easy indexing and metadata: [https://indonlp.github.io/nusa-catalogue](https://indonlp.github.io/nusa-catalogue) |
tyzhu/lmind_nq_train6000_eval6489_v1_reciteonly_qa_v3 | ---
dataset_info:
features:
- name: answers
struct:
- name: answer_start
sequence: 'null'
- name: text
sequence: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train_qa
num_bytes: 697367
num_examples: 6000
- name: train_ic_qa
num_bytes: 4540536
num_examples: 6000
- name: train_recite_qa
num_bytes: 4546536
num_examples: 6000
- name: eval_qa
num_bytes: 752802
num_examples: 6489
- name: eval_ic_qa
num_bytes: 4906186
num_examples: 6489
- name: eval_recite_qa
num_bytes: 4912675
num_examples: 6489
- name: all_docs
num_bytes: 7126313
num_examples: 10925
- name: all_docs_eval
num_bytes: 7125701
num_examples: 10925
- name: train
num_bytes: 3818906
num_examples: 6000
- name: validation
num_bytes: 4103798
num_examples: 6489
download_size: 26446330
dataset_size: 42530820
configs:
- config_name: default
data_files:
- split: train_qa
path: data/train_qa-*
- split: train_ic_qa
path: data/train_ic_qa-*
- split: train_recite_qa
path: data/train_recite_qa-*
- split: eval_qa
path: data/eval_qa-*
- split: eval_ic_qa
path: data/eval_ic_qa-*
- split: eval_recite_qa
path: data/eval_recite_qa-*
- split: all_docs
path: data/all_docs-*
- split: all_docs_eval
path: data/all_docs_eval-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
CyberHarem/rosmontis_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of rosmontis/ロスモンティス/迷迭香 (Arknights)
This is the dataset of rosmontis/ロスモンティス/迷迭香 (Arknights), containing 500 images and their tags.
The core tags of this character are `animal_ears, cat_ears, long_hair, cat_girl, animal_ear_fluff, green_eyes, grey_hair, white_hair, tail, cat_tail`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 872.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rosmontis_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 425.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rosmontis_arknights/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1280 | 955.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rosmontis_arknights/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 732.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rosmontis_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1280 | 1.44 GiB | [Download](https://huggingface.co/datasets/CyberHarem/rosmontis_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/rosmontis_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
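Once loaded, the `item.meta['tags']` mapping shown above can be aggregated, for instance to find the most frequent tags. A minimal sketch, using plain dictionaries to stand in for the waifuc item metadata (the `count_tags` helper is hypothetical):

```python
from collections import Counter

def count_tags(items):
    """Aggregate tag frequencies across dataset items."""
    counts = Counter()
    for item in items:
        counts.update(item['tags'])
    return counts

# stand-ins for the meta dictionaries of two loaded items
items = [
    {'tags': ['1girl', 'solo', 'jacket']},
    {'tags': ['1girl', 'white_dress']},
]
print(count_tags(items).most_common(1))  # → [('1girl', 2)]
```

With a real `LocalSource`, the same aggregation would iterate over `source` and update the counter from each `item.meta['tags']`.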
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, black_gloves, fingerless_gloves, hood_down, jacket, solo, white_dress, holding_book, infection_monitor_(arknights), looking_at_viewer, closed_mouth, earpiece, hooded_cloak |
| 1 | 6 |  |  |  |  |  | 1girl, black_footwear, black_gloves, closed_mouth, fingerless_gloves, full_body, solo, white_dress, boots, looking_at_viewer, earpiece, hood_down, hooded_cloak, infection_monitor_(arknights), thigh_strap |
| 2 | 10 |  |  |  |  |  | 1girl, hood_down, looking_at_viewer, solo, upper_body, jacket, blush, simple_background, white_background, closed_mouth |
| 3 | 13 |  |  |  |  |  | 1girl, collar, looking_at_viewer, sleeveless_dress, solo, white_dress, bare_shoulders, official_alternate_costume, barefoot, twintails, closed_mouth, full_body, very_long_hair, bare_legs, feet, mechanical_wings, toes, white_background, colored_eyelashes, simple_background, sitting |
| 4 | 10 |  |  |  |  |  | 1boy, 1girl, blush, hetero, sex, nipples, sweat, vaginal, penis, solo_focus, open_mouth, small_breasts, black_gloves, fingerless_gloves, spread_legs, thigh_strap, dress, mosaic_censoring, navel, straddling, closed_eyes, coat, cum_in_pussy, girl_on_top, looking_at_viewer, missionary, on_back, pov |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_gloves | fingerless_gloves | hood_down | jacket | solo | white_dress | holding_book | infection_monitor_(arknights) | looking_at_viewer | closed_mouth | earpiece | hooded_cloak | black_footwear | full_body | boots | thigh_strap | upper_body | blush | simple_background | white_background | collar | sleeveless_dress | bare_shoulders | official_alternate_costume | barefoot | twintails | very_long_hair | bare_legs | feet | mechanical_wings | toes | colored_eyelashes | sitting | 1boy | hetero | sex | nipples | sweat | vaginal | penis | solo_focus | open_mouth | small_breasts | spread_legs | dress | mosaic_censoring | navel | straddling | closed_eyes | coat | cum_in_pussy | girl_on_top | missionary | on_back | pov |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:--------------------|:------------|:---------|:-------|:--------------|:---------------|:--------------------------------|:--------------------|:---------------|:-----------|:---------------|:-----------------|:------------|:--------|:--------------|:-------------|:--------|:--------------------|:-------------------|:---------|:-------------------|:-----------------|:-----------------------------|:-----------|:------------|:-----------------|:------------|:-------|:-------------------|:-------|:--------------------|:----------|:-------|:---------|:------|:----------|:--------|:----------|:--------|:-------------|:-------------|:----------------|:--------------|:--------|:-------------------|:--------|:-------------|:--------------|:-------|:---------------|:--------------|:-------------|:----------|:------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | | X | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 10 |  |  |  |  |  | X | | | X | X | X | | | | X | X | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 13 |  |  |  |  |  | X | | | | | X | X | | | X | X | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 10 |  |  |  |  |  | X | X | X | | | | | | | X | | | | | | | X | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
satyambarnwal/cartoonizer-dataset | ---
dataset_info:
features:
- name: original_image
dtype: image
- name: edit_prompt
dtype: string
- name: cartoonized_image
dtype: image
splits:
- name: train
num_bytes: 31656230.0
num_examples: 50
download_size: 31658428
dataset_size: 31656230.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
asun17904/wiki2017_bank_examples | ---
dataset_info:
features:
- name: section_texts
sequence: string
- name: section_titles
sequence: string
- name: title
dtype: string
- name: soi
sequence: string
- name: other
sequence: string
splits:
- name: train
num_bytes: 3467283652
num_examples: 163992
download_size: 0
dataset_size: 3467283652
---
# Dataset Card for "wiki2017_bank_examples"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
atmallen/companies_azaria_mitchell | ---
dataset_info:
features:
- name: statement
dtype: string
- name: label
dtype:
class_label:
names:
'0': 'false'
'1': 'true'
splits:
- name: train
num_bytes: 80207.2
num_examples: 960
- name: test
num_bytes: 20051.8
num_examples: 240
download_size: 34846
dataset_size: 100259.0
---
# Dataset Card for "companies_azaria_mitchell"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |