datasetId stringlengths 2 117 | card stringlengths 19 1.01M |
|---|---|
Falah/food102-iraqi-rice-meal | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': apple_pie
'1': baby_back_ribs
'2': baklava
'3': beef_carpaccio
'4': beef_tartare
'5': beet_salad
'6': beignets
'7': bibimbap
'8': bread_pudding
'9': breakfast_burrito
'10': bruschetta
'11': caesar_salad
'12': cannoli
'13': caprese_salad
'14': carrot_cake
'15': ceviche
'16': cheese_plate
'17': cheesecake
'18': chicken_curry
'19': chicken_quesadilla
'20': chicken_wings
'21': chocolate_cake
'22': chocolate_mousse
'23': churros
'24': clam_chowder
'25': club_sandwich
'26': crab_cakes
'27': creme_brulee
'28': croque_madame
'29': cup_cakes
'30': deviled_eggs
'31': donuts
'32': dumplings
'33': edamame
'34': eggs_benedict
'35': escargots
'36': falafel
'37': filet_mignon
'38': fish_and_chips
'39': foie_gras
'40': french_fries
'41': french_onion_soup
'42': french_toast
'43': fried_calamari
'44': fried_rice
'45': frozen_yogurt
'46': garlic_bread
'47': gnocchi
'48': greek_salad
'49': grilled_cheese_sandwich
'50': grilled_salmon
'51': guacamole
'52': gyoza
'53': hamburger
'54': hot_and_sour_soup
'55': hot_dog
'56': huevos_rancheros
'57': hummus
'58': ice_cream
'59': lasagna
'60': lobster_bisque
'61': lobster_roll_sandwich
'62': macaroni_and_cheese
'63': macarons
'64': miso_soup
'65': mussels
'66': nachos
'67': omelette
'68': onion_rings
'69': oysters
'70': pad_thai
'71': paella
'72': pancakes
'73': panna_cotta
'74': peking_duck
'75': pho
'76': pizza
'77': pork_chop
'78': poutine
'79': prime_rib
'80': pulled_pork_sandwich
'81': ramen
'82': ravioli
'83': red_velvet_cake
'84': rice_meal
'85': risotto
'86': samosa
'87': sashimi
'88': scallops
'89': seaweed_salad
'90': shrimp_and_grits
'91': spaghetti_bolognese
'92': spaghetti_carbonara
'93': spring_rolls
'94': steak
'95': strawberry_shortcake
'96': sushi
'97': tacos
'98': takoyaki
'99': tiramisu
'100': tuna_tartare
'101': waffles
splits:
- name: train
num_bytes: 4881528176.3
num_examples: 101100
download_size: 5108984474
dataset_size: 4881528176.3
license: apache-2.0
task_categories:
- image-classification
language:
- en
pretty_name: food101+Iraqi-rice-meal
size_categories:
- 100K<n<1M
extra_gated_prompt: "You agree to not attempt to determine the identity of individuals in this dataset"
extra_gated_fields:
Name: text
Country: text
Email: text
I agree to use this model for non-commercial use ONLY: checkbox
---
## Dataset Card for Food-102 (Food101+Iraqi-rice-male )
Dataset Name: Food-102
Dataset Summary:
Food-102 is an updated version of the Food-101 dataset, now expanded to include 102 food categories. It consists of a total of 102,000 images, with 750 training images and 250 manually reviewed test images provided for each category. The dataset aims to enable food classification tasks and provide a diverse range of food images for research and development purposes. The training images in Food-102 have intentionally not been cleaned, allowing for some level of noise, such as intense colors and occasional mislabeled images. All images in the dataset have been rescaled to have a maximum side length of 512 pixels.
## Additional Information:
- Number of Categories: 102
- Total Images: 101,100
- Training Images per Category: 75825
- Test Images per Category: 25275
- Image Noise: The training images may contain some noise, including intense colors and occasional mislabeled images.
- Image Rescaling: All images in the dataset have been resized to have a maximum side length of 512 pixels.
## Note:
The newly added category "Iraqi rice male food" is not specifically mentioned as part of the Food-101 dataset. If you require further details or have any specific questions about the dataset, please let me know. |
cmu-mlsp/librispeech960-wavlm-large-km1000_asr | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
- split: validation_other
path: data/validation_other-*
- split: test_other
path: data/test_other-*
dataset_info:
features:
- name: text
dtype: string
- name: audio_codes
sequence: string
- name: id
dtype: string
- name: speaker_id
dtype: int64
- name: chapter_id
dtype: int64
splits:
- name: train
num_bytes: 1246247156
num_examples: 281241
- name: validation
num_bytes: 7052458
num_examples: 2703
- name: test
num_bytes: 7062964
num_examples: 2620
- name: validation_other
num_bytes: 6706447
num_examples: 2864
- name: test_other
num_bytes: 6987808
num_examples: 2939
download_size: 254541270
dataset_size: 1274056833
---
# Dataset Card for "librispeech960-wavlm-large-km1000_asr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zolak/twitter_dataset_80_1713219503 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 158279
num_examples: 379
download_size: 88333
dataset_size: 158279
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
BuroIdentidadDigital/pasaporte_Mex | ---
license: c-uda
---
|
CyberHarem/gitano_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of gitano/ギターノ/远山 (Arknights)
This is the dataset of gitano/ギターノ/远山 (Arknights), containing 43 images and their tags.
The core tags of this character are `animal_ears, long_hair, breasts, green_eyes, very_long_hair, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 43 | 75.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gitano_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 43 | 63.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gitano_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 116 | 126.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gitano_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/gitano_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, solo, looking_at_viewer, smile, cleavage, holding_card, simple_background, white_background, bracelet, collarbone, long_sleeves, medium_breasts, official_alternate_costume, ponytail, red_jacket, sunglasses, upper_body, choker, earrings, eyewear_on_head, grey_hair, hand_up, mole_under_mouth, open_jacket, parted_lips, white_hair, white_shirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | smile | cleavage | holding_card | simple_background | white_background | bracelet | collarbone | long_sleeves | medium_breasts | official_alternate_costume | ponytail | red_jacket | sunglasses | upper_body | choker | earrings | eyewear_on_head | grey_hair | hand_up | mole_under_mouth | open_jacket | parted_lips | white_hair | white_shirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------|:-----------|:---------------|:--------------------|:-------------------|:-----------|:-------------|:---------------|:-----------------|:-----------------------------|:-----------|:-------------|:-------------|:-------------|:---------|:-----------|:------------------|:------------|:----------|:-------------------|:--------------|:--------------|:-------------|:--------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
CyberHarem/kasodani_kyouko_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kasodani_kyouko/幽谷響子 (Touhou)
This is the dataset of kasodani_kyouko/幽谷響子 (Touhou), containing 500 images and their tags.
The core tags of this character are `green_hair, short_hair, animal_ears, green_eyes, dog_ears, tail`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 462.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kasodani_kyouko_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 324.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kasodani_kyouko_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1062 | 627.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kasodani_kyouko_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 434.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kasodani_kyouko_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1062 | 794.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kasodani_kyouko_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kasodani_kyouko_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | 1girl, open_mouth, smile, solo, bamboo_broom, dress, fang, blush |
| 1 | 6 |  |  |  |  |  | 1girl, long_sleeves, looking_at_viewer, simple_background, solo, bamboo_broom, blush, holding_broom, white_background, :d, open_mouth, pink_dress |
| 2 | 18 |  |  |  |  |  | 1girl, full_body, solo, white_socks, black_footwear, holding_broom, long_sleeves, open_mouth, pink_dress, shoes, looking_at_viewer, smile, bamboo_broom, simple_background, standing, white_background |
| 3 | 7 |  |  |  |  |  | 1girl, holding_broom, long_sleeves, pink_dress, solo, blush, looking_at_viewer, bangs, upper_body, hair_between_eyes, smile, bamboo_broom, closed_mouth, open_mouth |
| 4 | 5 |  |  |  |  |  | 1girl, bangs, full_body, long_sleeves, simple_background, solo, standing, white_background, white_socks, black_footwear, hair_between_eyes, open_mouth, pink_dress, shoes, :d, blush, dog_tail, looking_at_viewer, skin_fang, ahoge, brown_dress |
| 5 | 5 |  |  |  |  |  | 1boy, 1girl, blush, hetero, nipples, open_mouth, solo_focus, cum_in_pussy, looking_at_viewer, navel, penis, sex, small_breasts, vaginal, collarbone, dog_tail, spread_legs, bar_censor, bikini_bottom_aside, medium_breasts, on_back, smile, sweat |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | open_mouth | smile | solo | bamboo_broom | dress | fang | blush | long_sleeves | looking_at_viewer | simple_background | holding_broom | white_background | :d | pink_dress | full_body | white_socks | black_footwear | shoes | standing | bangs | upper_body | hair_between_eyes | closed_mouth | dog_tail | skin_fang | ahoge | brown_dress | 1boy | hetero | nipples | solo_focus | cum_in_pussy | navel | penis | sex | small_breasts | vaginal | collarbone | spread_legs | bar_censor | bikini_bottom_aside | medium_breasts | on_back | sweat |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:--------|:-------|:---------------|:--------|:-------|:--------|:---------------|:--------------------|:--------------------|:----------------|:-------------------|:-----|:-------------|:------------|:--------------|:-----------------|:--------|:-----------|:--------|:-------------|:--------------------|:---------------|:-----------|:------------|:--------|:--------------|:-------|:---------|:----------|:-------------|:---------------|:--------|:--------|:------|:----------------|:----------|:-------------|:--------------|:-------------|:----------------------|:-----------------|:----------|:--------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | | X | X | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 18 |  |  |  |  |  | X | X | X | X | X | | | | X | X | X | X | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | X | X | X | X | | | X | X | X | | X | | | X | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | | X | | | | X | X | X | X | | X | X | X | X | X | X | X | X | X | | X | | X | X | X | X | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | X | X | | | | | X | | X | | | | | | | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
anonimoh656r7r65/brug | ---
license: openrail
language:
- en
pretty_name: matte
---
<audio controls src="https://cdn-uploads.huggingface.co/production/uploads/64f9df86686db7e7d0cd862b/-dXpk-T68-hlUrc7x9AdA.mpga"></audio>
|
davanstrien/autotrain-data-imagein-hand | Invalid username or password. |
mbazaNLP/common-voice-kinyarwanda-english-dataset | ---
language:
- rw
- en
license:
- cc-by-4.0
size_categories:
- ~ 3000 hours
- 721398 clips
---
# Kinyarwanda-English Commonvoice dataset
A compilation of Kinyarwanda-english dataset to be used to train multi-lingual ASR
**Note:** The audio dataset shall be added in the future |
A-Bar/nl-de_top_cs_train | ---
dataset_info:
features:
- name: query
dtype: string
- name: passage
dtype: string
- name: label
dtype: float64
splits:
- name: train
num_bytes: 438908013
num_examples: 1000000
download_size: 181261442
dataset_size: 438908013
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
irds/gov_trec-web-2004 | ---
pretty_name: '`gov/trec-web-2004`'
viewer: false
source_datasets: ['irds/gov']
task_categories:
- text-retrieval
---
# Dataset Card for `gov/trec-web-2004`
The `gov/trec-web-2004` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/gov#gov/trec-web-2004).
# Data
This dataset provides:
- `queries` (i.e., topics); count=225
- `qrels`: (relevance assessments); count=88,566
- For `docs`, use [`irds/gov`](https://huggingface.co/datasets/irds/gov)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/gov_trec-web-2004', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/gov_trec-web-2004', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@inproceedings{Craswell2004TrecWeb,
title={Overview of the TREC-2004 Web Track},
author={Nick Craswell and David Hawking},
booktitle={TREC},
year={2004}
}
```
|
Voxlab/Synthetic-Spoken-Digit-Dataset | ---
license: mpl-2.0
task_categories:
- conversational
- translation
- audio-classification
- automatic-speech-recognition
- text-to-speech
language:
- en
- es
- fr
- ko
- de
- it
- pt
- ru
- zh
- ja
---
# Synthetic Generated Free Spoken Digit Dataset
*This dataset is Generated by [Voxlab](https://voxlab.netlify.app/)*
#### Context
This dataset is generated by a Text to Speech Models (TTS). It contains spoken digits from 0 to 9.
This is a free to use for research as well as for commercial use.
Dataset contains simple audio files consisting of recordings of spoken digits in wav file.
### Current Status
**Languages** : 10
(English - en, Spanish - es, French - fr, Korean - ko, German - de, Italian - it, Portuguese - pt, Russian - ru, Chinese - zh-cn, Japanese - ja)
**Speakers** : for each language there is only 1 speaker
**No of audio files** : 5000 audio files (50 for each digit)
**Pronunciations** : This dataset contains pronunciations in all 10 languages
### Dataset File structure
Files are named in the following format: *{digitLabel}-{language}-{speakerGender}-{index}.wav*
*Example: seven-en-F-56.wav*
### How to use
```
from datasets import load_dataset
dataset = load_dataset("Voxlab/synthetic-generated-free-spoken-digit-dataset")
```
#### Inspiration
This dataset was inspired from free spoken digit dataset (https://www.kaggle.com/datasets/joserzapata/free-spoken-digit-dataset-fsdd)
Explore similar datasets generated by [Voxlab](voxlab.netlify.app)
https://github.com/synthetic-data-platform/Free-Synthetic-Datasets |
dllllb/alfa-scoring-trx | ---
task_categories:
- tabular-classification
tags:
- finance
pretty_name: Alfa Battle 2.0 contest scoring task
configs:
- config_name: train_transactions
data_files: train_transactions/*.parquet
- config_name: test_transactions
data_files: test_transactions/*.parquet
- config_name: train_target
data_files: train_target.csv.gz
- config_name: test_target
data_files: test_target.csv.gz
---
https://ods.ai/competitions/dl-fintech-card-transactions
https://boosters.pro/championship/alfabattle2 |
KnutJaegersberg/facehugger | ---
license: cc-by-nc-4.0
---
|
nampdn-ai/tiny-codes | ---
license: mit
task_categories:
- text-generation
language:
- en
pretty_name: Tiny Codes
size_categories:
- 1M<n<10M
---
# Reasoning with Language and Code
This synthetic dataset is a collection of **1.6 millions short and clear code snippets** that can help LLM models learn how to reason with both natural and programming languages. The dataset covers a wide range of programming languages, such as Python, TypeScript, JavaScript, Ruby, Julia, Rust, C++, Bash, Java, C#, and Go. It also includes two database languages: Cypher (for graph databases) and SQL (for relational databases) in order to study the relationship of entities.
The main goal of this repository is to highlight the importance of **textbook (high education value)** using **code snippets**. All code snippets are carefully written and commented to ensure maximum readability and understandability. Moreover, the use of **if/else control flow** is emphasized to foster the development of effective reasoning skills in LLM models.
This repository is inspired by the paper [Textbooks Are All You Need](https://arxiv.org/abs/2306.11644) and [The Magic of IF](https://aclanthology.org/2023.findings-acl.574.pdf), which shows that LLM models can achieve state-of-the-art results on code-related tasks by training on high-quality data that resembles textbooks and exercises. This repository aims to provide such data for data analysts and ML engineers who want to enhance their knowledge of how LLM models can learn to reason with code. Anyone who wants to reproduce this dataset can use these prompts with other LLM models and compare their results, or you can forge a new prompt from related properties.
*Please note that this dataset is not intended for code-generation purposes, it's intended to boost the reasoning capability of model via logic code.*
I hope you find this dataset useful and informative!
## Tiny Series
Explore the possibilities and limitations of building Small Language Models with these tiny gems of data!
- [TinyStories](https://arxiv.org/abs/2305.07759): The paper that sparked my interest in the journey of the tiny-* series.
- [tiny-textbooks](https://huggingface.co/datasets/nampdn-ai/tiny-textbooks): 420k "things of internet" synthetic textbooks.
- [tiny-orca-textbooks](https://huggingface.co/datasets/nampdn-ai/tiny-orca-textbooks): Synthetic textbook to help model learn in-context on how it should perform task the right way.
- [tiny-webtext](https://huggingface.co/datasets/nampdn-ai/tiny-webtext): A 6GB (4.5M records) variety of diverse webtext enriched with critical thinking methods to make unbiased English dataset.
- [tiny-lessons](https://huggingface.co/datasets/nampdn-ai/tiny-lessons): Subset of [tiny-textbooks](https://huggingface.co/datasets/nampdn-ai/tiny-textbooks) dataset, various lessons about "things of internet" augmented in a bite-sized textbook Markdown format.
- [tiny-bridgedict](https://huggingface.co/datasets/nampdn-ai/tiny-bridgedict): A dataset that links and transfers knowledge between English, Vietnamese, Chinese in a tiny multilingual models.
### Others small HQ datasets with textbook-like quality
- [devdocs.io](https://huggingface.co/datasets/nampdn-ai/devdocs.io): FreeCodeCamp has provided 189k comprehensive API documentation across a wide range of tech stacks and programming languages.
- [sciphi-python-textbook](https://huggingface.co/datasets/emrgnt-cmplxty/sciphi-python-textbook)
- [textbook_quality_programming](https://huggingface.co/datasets/vikp/textbook_quality_programming)
- [sciphi-textbooks-are-all-you-need](https://huggingface.co/datasets/emrgnt-cmplxty/sciphi-textbooks-are-all-you-need) |
sebarodri12/JennyRodmin | ---
tags:
- JennyRodmin
- Ecuadorian Woman
---
JennyRodmin imagesn |
turkish-nlp-suite/vitamins-supplements-NER | ---
language:
- tr
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
task_categories:
- token-classification
task_ids:
- named-entity-recognition
pretty_name: Vitamins and Supplements NER Dataset
---
# Dataset Card for turkish-nlp-suite/vitamins-supplements-NER
<img src="https://raw.githubusercontent.com/turkish-nlp-suite/.github/main/profile/supplementsNER.png" width="20%" height="20%">
### Dataset Description
- **Repository:** [Vitamins and Supplements NER Dataset](https://github.com/turkish-nlp-suite/Vitamins-Supplements-NER-dataset)
- **Paper:** [ACL link](https://aclanthology.org/2023.acl-long.768/)
- **Dataset:** Vitamins and Supplements NER Dataset
- **Domain:** E-commerce, customer reviews, medical
### Dataset Summary
The Vitamins and Supplements NER Dataset is a NER dataset containing customer reviews with entity and span annotations. User reviews were collected from a popular supplement products e-
commerce website Vitaminler.com.
Each customer review in the Vitamins and Supplements NER Dataset describes a customer’s experience with a supplement product in terms of that product’s effectiveness, side effects, taste and
smell, as well as comments on supplement usage frequency and dosage, active ingredients, brand, and similar products by other brands. An example review from the dataset with
entity and span annotations looks like this:
<img src="https://raw.githubusercontent.com/turkish-nlp-suite/.github/main/profile/positiv1.png" width="80%" height="80%">
The customer praises a biotin supplement; in their review they stated that they suffer from Thyroiditis and as a result they're experiencing hair loss. They purchased the biotin product to
prevent the hair fall and they described the effectiveness of the product as "their hair loss reduced noticably". Visual is created by displaCy.
## Tagset
For this dataset we annotated both entities and spans. Span annotations are common in medical NLP datasets, spans capture the information about "what happens with the entity", i.e. more semantics about the entities in the text.
NER tags and their distribution are in the dataset are as follows:
| Tag | Count |
|---|---|
| Disease | 1.875 |
| Biomolecule | 859 |
| User | 634 |
| Other_product | 543 |
| Recommender | 436 |
| Dosage | 471 |
| Brand | 275 |
| User_demographics | 192 |
| Ingredient | 175 |
| Other_brand | 121 |
Distribution of span tags:
| Tag | Count |
|---|---|
| Effect | 2.562 |
| Side_effect | 608 |
| Taste_smell | 558 |
| Health_complaints | 858 |
All annotations are done by [Co-one](https://co-one.co/). many thanks to them for their contributions.
### Dataset Instances
The dataset includes around 2.5K annotated reviews with annotations.
Each dataset instance contains
- customer review text
- entities and spans annotated
Here's an example for you:
```
{
"text": "Bu zamana kadar kullandığım en iyi B12 takviyesi. Doktorum saç dökülmem için verdi ama aç karnına dil altına bir fıs kullanınca KABIZLIK sorunumu çözdü. çok mutlu oldum. Indirimde gördüğünüz an kaçırmayın derim."
"spans": [
{ "val": "saç dökülmem", "label": "HASTALIK", "start": 59, "end": 71 },
{ "val": " KABIZLIK", "label": "HASTALIK", "start": 127, "end": 136 },
{ "val": "B12", "label": "BİYOMOLEKÜL", "start": 35, "end": 38 },
{ "val": " Doktorum", "label": "TAVSİYE_EDEN", "start": 49, "end": 58 },
{ "val": "bir fıs", "label": "DOZ", "start": 109, "end": 116 }
]
}
```
If you're rather interested in a big JSON, you can find the dataset as a single JSON in dataset's [Github repo](https://github.com/turkish-nlp-suite/Vitamins-Supplements-NER-Dataset).
### Data Split
| name |train|validation|test|
|---------|----:|---:|---:|
|Vitamins and Supplements NER Dataset|2072|200|200|
### Citation
This work is supported by Google Developer Experts Program. Part of Duygu 2022 Fall-Winter collection, "Turkish NLP with Duygu"/ "Duygu'yla Türkçe NLP". All rights reserved. If you'd like to use this dataset in your own work, please kindly cite [A Diverse Set of Freely Available Linguistic Resources for Turkish](https://aclanthology.org/2023.acl-long.768/) :
```
@inproceedings{altinok-2023-diverse,
title = "A Diverse Set of Freely Available Linguistic Resources for {T}urkish",
author = "Altinok, Duygu",
booktitle = "Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
month = jul,
year = "2023",
address = "Toronto, Canada",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.acl-long.768",
pages = "13739--13750",
abstract = "This study presents a diverse set of freely available linguistic resources for Turkish natural language processing, including corpora, pretrained models and education material. Although Turkish is spoken by a sizeable population of over 80 million people, Turkish linguistic resources for natural language processing remain scarce. In this study, we provide corpora to allow practitioners to build their own applications and pretrained models that would assist industry researchers in creating quick prototypes. The provided corpora include named entity recognition datasets of diverse genres, including Wikipedia articles and supplement products customer reviews. In addition, crawling e-commerce and movie reviews websites, we compiled several sentiment analysis datasets of different genres. Our linguistic resources for Turkish also include pretrained spaCy language models. To the best of our knowledge, our models are the first spaCy models trained for the Turkish language. Finally, we provide various types of education material, such as video tutorials and code examples, that can support the interested audience on practicing Turkish NLP. The advantages of our linguistic resources are three-fold: they are freely available, they are first of their kind, and they are easy to use in a broad range of implementations. Along with a thorough description of the resource creation process, we also explain the position of our resources in the Turkish NLP world.",
}
```
|
Ochkaron/writing | ---
license: apache-2.0
---
|
fewshot-goes-multilingual/cs_squad-3.0 | ---
annotations_creators:
- crowdsourced
language:
- cs
language_creators:
- crowdsourced
license:
- lgpl-3.0
multilinguality:
- monolingual
pretty_name: Czech Simple Question Answering Dataset
size_categories:
- 1K<n<10K
source_datasets:
- original
tags:
- czech QA
- wikipedia QA
task_categories:
- question-answering
task_ids:
- extractive-qa
---
# Dataset Card for Czech Simple Question Answering Dataset 3.0
This a processed and filtered adaptation of an existing dataset. For raw and larger dataset, see `Dataset Source` section.
## Dataset Description
The data contains questions and answers based on Czech wikipeadia articles.
Each question has an answer (or more) and a selected part of the context as the evidence.
A majority of the answers are extractive - i.e. they are present in the context in the exact form. The remaining cases are
- yes/no questions
- answer is almost in the exact form present in the text, but the form of words was changed to suit the question (declension, ...)
- answered in own words (should be rare, but is not)
All questions in the dataset are answerable from the context. Small minority of questions have multiple answers.
Sometimes it means that any of them is correct (e.g. either "Pacifik" or "Tichý oceán" are correct terms for Pacific Ocean)
and sometimes it means that all of them together are a correct answer (e.g., Who was Leonardo da Vinci? ["painter", "engineer"])
Total number of examples is around:
- 6,250 in train
- 570 in validation
- 850 in test.
## Dataset Features
Each example contains:
- `item_id`: string id of the
- `context`: "reasonably" big chunk (string) of wikipedia article that contains the answer
- `question`: string
- `answers`: list of all answers (string). mostly list of length 1
- `evidence_text`: substring of context (typically one sentence) that is sufficient to answer the question
- `evidence_start`: index in context, such that `context[evidence_start:evidence_end] == evidence_text`
- `evidence_end`: index in context
- `occurences`:
list of (dictionaries) occurences of the answer(s) in the evidence.
Each answer was searched with word boundaries ("\b" in regex) and case-sensitive in the evidence.
If nothing found, try again but case-insensitive.
If nothing found, try again but case-sensitive without word boundaries.
If nothing found, try again but case-insensitive without word boundaries.
This process should supress "false positive" occurences of the answer in the evidence.
- `start`: index in context
- `end`: index in context
- `text`: the answer looked for
- `url`: link to the wikipedia article
- `original_article`: original parsed wikipedia article from which the context is taken
- `question_type`: type of the question, one of: ['ABBREVIATION', 'DATETIME', 'DENOTATION', 'ENTITY', 'LOCATION', 'NUMERIC', 'ORGANIZATION', 'OTHER', 'PERSON', 'YES_NO']
- `answer_type`: type of the answer, one of: ['ABBREVIATION', 'ADJ_PHRASE', 'CLAUSE', 'DATETIME', 'ENTITY', 'LOCATION', 'NUMERIC', 'OTHER', 'PERSON', 'VERB_PHRASE']
## Dataset Source
The dataset is a preprocessed adaptation of existing SQAD 3.0 dataset [link to data](https://lindat.cz/repository/xmlui/handle/11234/1-3069).
This adaptation contains (almost) same data, but converted to a convenient format.
The data was also filtered to remove a statistical bias where the answer was contained
in the first sentence in the article (around 50% of all data in the original dataset, likely
caused by the data collection process).
## Citation
Cite authors of the [original dataset](https://lindat.cz/repository/xmlui/handle/11234/1-3069):
```bibtex
@misc{11234/1-3069,
title = {sqad 3.0},
author = {Medve{\v d}, Marek and Hor{\'a}k, Ale{\v s}},
url = {http://hdl.handle.net/11234/1-3069},
note = {{LINDAT}/{CLARIAH}-{CZ} digital library at the Institute of Formal and Applied Linguistics ({{\'U}FAL}), Faculty of Mathematics and Physics, Charles University},
copyright = {{GNU} Library or "Lesser" General Public License 3.0 ({LGPL}-3.0)},
year = {2019}
}
```
|
tracywong117/NCBI-Taxonomy | ---
license: mit
language:
- en
tags:
- biology
---
The data is retrieved from NCBI Taxonomy on 1 Feb 2024. Please refer to my [GitHub](https://github.com/tracywong117/NCBI-get-all-children-organism-under-ancestor) for the detail of data extraction. |
hugfaceguy0001/LightNovels120kto150k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 140078399
num_examples: 474
download_size: 88310840
dataset_size: 140078399
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
cahya/instructions-ja | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 27079870.672446616
num_examples: 55389
- name: test
num_bytes: 712821.1637766915
num_examples: 1458
- name: validation
num_bytes: 712821.1637766915
num_examples: 1458
download_size: 14983193
dataset_size: 28505513.0
---
# Dataset Card for "instructions-ja"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Yuchong/us-vessel | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 1182963.0
num_examples: 4
download_size: 185605
dataset_size: 1182963.0
---
# Dataset Card for "us-vessel"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
RandyHuynh5815/TACO_Test_Reformatted | ---
dataset_info:
features:
- name: image
dtype: image
- name: categories
sequence: int8
splits:
- name: train
num_bytes: 2720258641.5
num_examples: 1500
download_size: 2621965640
dataset_size: 2720258641.5
---
# Dataset Card for "TACO_Test_Reformatted"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/hanazuki_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of hanazuki/花月/花月 (Azur Lane)
This is the dataset of hanazuki/花月/花月 (Azur Lane), containing 127 images and their tags.
The core tags of this character are `pink_hair, animal_ears, long_hair, green_eyes, fox_ears, hair_ornament, hairband, fox_girl, hair_flower, breasts, tail, fox_tail, bangs, animal_ear_fluff, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 127 | 220.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hanazuki_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 127 | 117.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hanazuki_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 327 | 260.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hanazuki_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 127 | 191.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hanazuki_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 327 | 382.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hanazuki_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hanazuki_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, bare_shoulders, black_gloves, detached_sleeves, flower, looking_at_viewer, oil-paper_umbrella, solo, blush, cherry_blossoms, holding_umbrella, smile, white_kimono, open_mouth |
| 1 | 15 |  |  |  |  |  | 1girl, bare_shoulders, black_gloves, looking_at_viewer, solo, blush, detached_sleeves, flower, oil-paper_umbrella, smile, white_kimono, wide_sleeves, holding_umbrella, obi, cherry_blossoms, closed_mouth, long_sleeves, sleeveless_kimono, no_panties, very_long_hair, groin, petals, sideboob |
| 2 | 15 |  |  |  |  |  | 1girl, bare_shoulders, flower, looking_at_viewer, solo, official_alternate_costume, china_dress, clothing_cutout, pelvic_curtain, cleavage, white_dress, red_gloves, sleeveless_dress, white_thighhighs, feather_boa, pink_gloves, holding, medium_breasts, open_mouth, pink_hairband, gold_trim, simple_background, sitting, smile, very_long_hair, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | black_gloves | detached_sleeves | flower | looking_at_viewer | oil-paper_umbrella | solo | blush | cherry_blossoms | holding_umbrella | smile | white_kimono | open_mouth | wide_sleeves | obi | closed_mouth | long_sleeves | sleeveless_kimono | no_panties | very_long_hair | groin | petals | sideboob | official_alternate_costume | china_dress | clothing_cutout | pelvic_curtain | cleavage | white_dress | red_gloves | sleeveless_dress | white_thighhighs | feather_boa | pink_gloves | holding | medium_breasts | pink_hairband | gold_trim | simple_background | sitting | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:---------------|:-------------------|:---------|:--------------------|:---------------------|:-------|:--------|:------------------|:-------------------|:--------|:---------------|:-------------|:---------------|:------|:---------------|:---------------|:--------------------|:-------------|:-----------------|:--------|:---------|:-----------|:-----------------------------|:--------------|:------------------|:-----------------|:-----------|:--------------|:-------------|:-------------------|:-------------------|:--------------|:--------------|:----------|:-----------------|:----------------|:------------|:--------------------|:----------|:-------------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 15 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 2 | 15 |  |  |  |  |  | X | X | | | X | X | | X | | | | X | | X | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
TechxGenus/LeetCode-Contest | ---
license: other
license_name: deepseek
license_link: >-
https://huggingface.co/deepseek-ai/deepseek-coder-1.3b-instruct/blob/main/LICENSE
task_categories:
- text-generation
language:
- en
tags:
- code
---
## LeetCode Contest Benchmark
A new benchmark for evaluating Code LLMs proposed by [DeepSeek-Coder](https://arxiv.org/abs/2401.14196), which consists of the latest algorithm problems of different difficulties.
## Usage
```
git clone https://github.com/deepseek-ai/DeepSeek-Coder.git
cd Evaluation/LeetCode
# Set the model or path here
MODEL="deepseek-ai/deepseek-coder-7b-instruct"
python vllm_inference.py --model_name_or_path $MODEL --saved_path output/20240121-Jul.deepseek-coder-7b-instruct.jsonl
python evaluate_leetcode.py --generation_path output/20240121-Jul.deepseek-coder-7b-instruct.jsonl --result_path output/20240121-Jul.deepseek-coder-7b-instruct.result.jsonl
```
### Citation
```
@article{guo2024deepseekcoder,
title = {DeepSeek-Coder: When the Large Language Model Meets Programming - The Rise of Code Intelligence},
author = {Daya Guo and Qihao Zhu and Dejian Yang and Zhenda Xie and Kai Dong and Wentao Zhang and Guanting Chen and Xiao Bi and Y. Wu and Y. K. Li and Fuli Luo and Yingfei Xiong and Wenfeng Liang},
year = {2024},
journal = {arXiv preprint arXiv: 2401.14196}
}
```
|
joey234/mmlu-professional_law-dev | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: dev
num_bytes: 11945
num_examples: 5
download_size: 27309
dataset_size: 11945
---
# Dataset Card for "mmlu-professional_law-dev"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shuaishuaicdp/MLLM-Judge | ---
license: mit
---
|
milkshake721/stem-wiki-cohere-no-emb | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_facebook__opt-66b | ---
pretty_name: Evaluation run of facebook/opt-66b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [facebook/opt-66b](https://huggingface.co/facebook/opt-66b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 122 configuration, each one coresponding to one of\
\ the evaluated task.\n\nThe dataset has been created from 5 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run.The \"train\" split is always pointing to the latest\
\ results.\n\nAn additional configuration \"results\" store all the aggregated results\
\ of the run (and is used to compute and display the aggregated metrics on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_facebook__opt-66b\"\
,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\
\ are the [latest results from run 2023-12-03T00:30:57.404111](https://huggingface.co/datasets/open-llm-leaderboard/details_facebook__opt-66b/blob/main/results_2023-12-03T00-30-57.404111.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.016679302501895376,\n\
\ \"acc_stderr\": 0.0035275958887224556\n },\n \"harness|gsm8k|5\"\
: {\n \"acc\": 0.016679302501895376,\n \"acc_stderr\": 0.0035275958887224556\n\
\ }\n}\n```"
repo_url: https://huggingface.co/facebook/opt-66b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|arc:challenge|25_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|arc:challenge|25_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_09T17_37_15.988083
path:
- '**/details_harness|drop|3_2023-09-09T17-37-15.988083.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-09T17-37-15.988083.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_09T17_37_15.988083
path:
- '**/details_harness|gsm8k|5_2023-09-09T17-37-15.988083.parquet'
- split: 2023_12_03T00_30_57.404111
path:
- '**/details_harness|gsm8k|5_2023-12-03T00-30-57.404111.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-03T00-30-57.404111.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hellaswag|10_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hellaswag|10_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T18:07:59.118983.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T00:29:23.220857.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T18:07:59.118983.parquet'
- split: 2023_08_24T00_29_23.220857
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T00:29:23.220857.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-24T00:29:23.220857.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_09T17_37_15.988083
path:
- '**/details_harness|winogrande|5_2023-09-09T17-37-15.988083.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-09T17-37-15.988083.parquet'
- config_name: original_mmlu_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:management|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:virology|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:anatomy|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:astronomy|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:business_ethics|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:college_biology|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:college_medicine|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:college_physics|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:computer_security|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:econometrics|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:formal_logic|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:global_facts|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:human_aging|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:international_law|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:machine_learning|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:management|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:marketing|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:nutrition|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:philosophy|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:prehistory|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:professional_law|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:public_relations|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:security_studies|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:sociology|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:virology|5_2023-08-28T21:15:14.969062.parquet'
- '**/details_original|mmlu:world_religions|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_abstract_algebra_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:abstract_algebra|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_anatomy_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:anatomy|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:anatomy|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_astronomy_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:astronomy|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:astronomy|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_business_ethics_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:business_ethics|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_clinical_knowledge_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:clinical_knowledge|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_college_biology_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:college_biology|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_biology|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_college_chemistry_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_chemistry|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_college_computer_science_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_computer_science|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_college_mathematics_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_mathematics|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_college_medicine_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_medicine|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_college_physics_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:college_physics|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:college_physics|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_computer_security_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:computer_security|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:computer_security|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_conceptual_physics_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:conceptual_physics|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_econometrics_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:econometrics|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:econometrics|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_electrical_engineering_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:electrical_engineering|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_elementary_mathematics_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:elementary_mathematics|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_formal_logic_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:formal_logic|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_global_facts_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:global_facts|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:global_facts|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_high_school_biology_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_biology|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_high_school_chemistry_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_chemistry|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_high_school_computer_science_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_computer_science|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_high_school_european_history_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_european_history|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_high_school_geography_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_geography|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_high_school_government_and_politics_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_high_school_macroeconomics_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_high_school_mathematics_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_mathematics|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_high_school_microeconomics_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_high_school_physics_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_physics|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_high_school_psychology_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_psychology|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_high_school_statistics_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_statistics|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_high_school_us_history_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_us_history|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_high_school_world_history_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:high_school_world_history|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_human_aging_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:human_aging|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:human_aging|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_human_sexuality_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:human_sexuality|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_international_law_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:international_law|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:international_law|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_jurisprudence_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:jurisprudence|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_logical_fallacies_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:logical_fallacies|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_machine_learning_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:machine_learning|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_management_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:management|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:management|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_marketing_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:marketing|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:marketing|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_medical_genetics_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:medical_genetics|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_miscellaneous_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:miscellaneous|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_moral_disputes_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:moral_disputes|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_moral_scenarios_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:moral_scenarios|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_nutrition_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:nutrition|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:nutrition|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_philosophy_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:philosophy|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:philosophy|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_prehistory_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:prehistory|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:prehistory|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_professional_accounting_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_accounting|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_professional_law_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:professional_law|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_law|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_professional_medicine_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_medicine|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_professional_psychology_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:professional_psychology|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_public_relations_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:public_relations|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:public_relations|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_security_studies_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:security_studies|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:security_studies|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_sociology_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:sociology|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:sociology|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_us_foreign_policy_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:us_foreign_policy|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_virology_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:virology|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:virology|5_2023-08-28T21:15:14.969062.parquet'
- config_name: original_mmlu_world_religions_5
data_files:
- split: 2023_08_28T21_15_14.969062
path:
- '**/details_original|mmlu:world_religions|5_2023-08-28T21:15:14.969062.parquet'
- split: latest
path:
- '**/details_original|mmlu:world_religions|5_2023-08-28T21:15:14.969062.parquet'
- config_name: results
data_files:
- split: 2023_08_23T18_07_59.118983
path:
- results_2023-08-23T18:07:59.118983.parquet
- split: 2023_08_24T00_29_23.220857
path:
- results_2023-08-24T00:29:23.220857.parquet
- split: 2023_08_28T21_15_14.969062
path:
- results_2023-08-28T21:15:14.969062.parquet
- split: 2023_09_09T17_37_15.988083
path:
- results_2023-09-09T17-37-15.988083.parquet
- split: 2023_12_03T00_30_57.404111
path:
- results_2023-12-03T00-30-57.404111.parquet
- split: latest
path:
- results_2023-12-03T00-30-57.404111.parquet
---
# Dataset Card for Evaluation run of facebook/opt-66b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/facebook/opt-66b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [facebook/opt-66b](https://huggingface.co/facebook/opt-66b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 122 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 5 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_facebook__opt-66b",
"harness_gsm8k_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-03T00:30:57.404111](https://huggingface.co/datasets/open-llm-leaderboard/details_facebook__opt-66b/blob/main/results_2023-12-03T00-30-57.404111.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.016679302501895376,
"acc_stderr": 0.0035275958887224556
},
"harness|gsm8k|5": {
"acc": 0.016679302501895376,
"acc_stderr": 0.0035275958887224556
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/socie_granbluefantasy | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of socie/ソシエ (Granblue Fantasy)
This is the dataset of socie/ソシエ (Granblue Fantasy), containing 243 images and their tags.
The core tags of this character are `animal_ears, long_hair, breasts, blue_eyes, hair_ornament, fox_ears, large_breasts, tail, bangs, fox_tail, very_long_hair, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 243 | 331.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/socie_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 243 | 212.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/socie_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 549 | 418.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/socie_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 243 | 303.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/socie_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 549 | 559.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/socie_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/socie_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, cleavage, collarbone, erune, solo, blush, simple_background, white_background, fur_trim, looking_at_viewer, detached_sleeves, upper_body |
| 1 | 8 |  |  |  |  |  | 1girl, cleavage, erune, fox_shadow_puppet, looking_at_viewer, smile, solo, blush, detached_sleeves, collarbone, sideboob |
| 2 | 6 |  |  |  |  |  | 1girl, detached_sleeves, erune, looking_at_viewer, sideboob, solo, bare_back, looking_back, backless_outfit, smile, blush |
| 3 | 6 |  |  |  |  |  | 1girl, blunt_bangs, cleavage, erune, looking_at_viewer, navel, official_alternate_costume, smile, solo, bare_shoulders, parted_lips, simple_background, white_background, white_bikini, blush, hair_flower, bracelet, collarbone, holding, quill, see-through |
| 4 | 17 |  |  |  |  |  | 1girl, bare_shoulders, elbow_gloves, erune, looking_at_viewer, solo, white_gloves, blunt_bangs, smile, blush, thighhighs, cleavage, quill, mismatched_legwear, white_dress, fingerless_gloves, parted_lips, holding, sitting, cat_ears, simple_background |
| 5 | 6 |  |  |  |  |  | 1girl, black_jacket, erune, looking_at_viewer, open_jacket, smile, solo, thighhighs, blunt_bangs, blush, long_sleeves, mismatched_legwear, parted_lips, ribbed_dress, belt, feathers, quill, simple_background, crossed_legs, one_eye_closed, sitting, thighs, white_background |
| 6 | 5 |  |  |  |  |  | 1boy, 1girl, blush, erune, nipples, open_mouth, solo_focus, sweat, hetero, navel, fang, nude, penis, pussy_juice, barefoot, censored, collarbone, detached_sleeves, feet, heart-shaped_pupils, looking_at_viewer, saliva, sex_from_behind, spread_legs, tears, tongue_out |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | collarbone | erune | solo | blush | simple_background | white_background | fur_trim | looking_at_viewer | detached_sleeves | upper_body | fox_shadow_puppet | smile | sideboob | bare_back | looking_back | backless_outfit | blunt_bangs | navel | official_alternate_costume | bare_shoulders | parted_lips | white_bikini | hair_flower | bracelet | holding | quill | see-through | elbow_gloves | white_gloves | thighhighs | mismatched_legwear | white_dress | fingerless_gloves | sitting | cat_ears | black_jacket | open_jacket | long_sleeves | ribbed_dress | belt | feathers | crossed_legs | one_eye_closed | thighs | 1boy | nipples | open_mouth | solo_focus | sweat | hetero | fang | nude | penis | pussy_juice | barefoot | censored | feet | heart-shaped_pupils | saliva | sex_from_behind | spread_legs | tears | tongue_out |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:-------------|:--------|:-------|:--------|:--------------------|:-------------------|:-----------|:--------------------|:-------------------|:-------------|:--------------------|:--------|:-----------|:------------|:---------------|:------------------|:--------------|:--------|:-----------------------------|:-----------------|:--------------|:---------------|:--------------|:-----------|:----------|:--------|:--------------|:---------------|:---------------|:-------------|:---------------------|:--------------|:--------------------|:----------|:-----------|:---------------|:--------------|:---------------|:---------------|:-------|:-----------|:---------------|:-----------------|:---------|:-------|:----------|:-------------|:-------------|:--------|:---------|:-------|:-------|:--------|:--------------|:-----------|:-----------|:-------|:----------------------|:---------|:------------------|:--------------|:--------|:-------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | X | X | X | X | | | | X | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | | | X | X | X | | | | X | X | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | | X | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 17 |  |  |  |  |  | X | X | | X | X | X | X | | | X | | | | X | | | | | X | | | X | X | | | | X | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | | | X | X | X | X | X | | X | | | | X | | | | | X | | | | X | | | | | X | | | | X | X | | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | | X | X | | X | | | | X | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
waadarsh/magnite-dataset | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 38650
num_examples: 262
download_size: 16501
dataset_size: 38650
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Cheetor1996/Asuka_yamayoshi_style | ---
license: cc-by-2.0
language:
- en
tags:
- art
---
**Asuka Langley Soryu** - *Yamayoshi style*
- *Trained with anime (full-final-pruned) model*
- *Works best with ALL, MIDD, OUTD, OUTALL, and with 0.7+ weights* |
plutokokoa/translation-for-yu-gi-oh-ja-traditional-zh | ---
license: apache-2.0
dataset_info:
features:
- name: jp
dtype: string
- name: ch
dtype: string
splits:
- name: train
num_bytes: 7474238
num_examples: 10536
download_size: 2293121
dataset_size: 7474238
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Nadav/pixel_glue_rte_noisy_ocr | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
splits:
- name: train
num_bytes: 4051661
num_examples: 12450
- name: validation
num_bytes: 85353
num_examples: 277
download_size: 2835457
dataset_size: 4137014
---
# Dataset Card for "pixel_glue_rte_noisy_ocr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Multimodal-Fatima/VQAv2_sample_validation_20 | ---
dataset_info:
features:
- name: question_type
dtype: string
- name: multiple_choice_answer
dtype: string
- name: answers
sequence: string
- name: answers_original
list:
- name: answer
dtype: string
- name: answer_confidence
dtype: string
- name: answer_id
dtype: int64
- name: id_image
dtype: int64
- name: answer_type
dtype: string
- name: question_id
dtype: int64
- name: question
dtype: string
- name: image
dtype: image
- name: id
dtype: int64
- name: clip_tags_ViT_L_14
sequence: string
- name: blip_caption
dtype: string
- name: DETA_detections_deta_swin_large_o365_coco_classes
list:
- name: attribute
dtype: string
- name: box
sequence: float32
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float32
- name: size
dtype: string
- name: tag
dtype: string
- name: LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14
sequence: string
- name: DETA_detections_deta_swin_large_o365_coco_classes_ViT_L_14
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
- name: DETA_detections_deta_swin_large_o365_clip_ViT_L_14
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
- name: DETA_detections_deta_swin_large_o365_clip_ViT_L_14_blip_caption
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: caption
dtype: string
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
- name: new_info_captions3
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: caption
dtype: string
- name: captions_module
sequence:
sequence: string
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
- name: DETA_detections_deta_swin_large_o365_clip_ViT_L_14_blip_caption_caption_module
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: caption
dtype: string
- name: captions_module
sequence: string
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
- name: DETA_detections_deta_swin_large_o365_clip_ViT_L_14_blip_caption_caption_module_without_filtering
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: caption
dtype: string
- name: captions_module
sequence: string
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
- name: clip_tags_LAION_ViT_H_14_2B
sequence: string
- name: LLM_Description_gpt3_downstream_tasks_visual_genome_LAION-ViT-H-14-2B
sequence: string
- name: DETA_detections_deta_swin_large_o365_clip_ViT_L_14_blip_caption_caption_module_random
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: caption
dtype: string
- name: captions_module
sequence: string
- name: captions_module_filter
sequence: string
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
- name: Attributes_ViT_L_14_descriptors_text_davinci_003_full
sequence: string
- name: DETA_detections_deta_swin_large_o365_coco_classes_caption_module_random
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: captions_module
sequence: string
- name: captions_module_filter
sequence: string
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
splits:
- name: validation
num_bytes: 7350896.0
num_examples: 20
download_size: 5171987
dataset_size: 7350896.0
---
# Dataset Card for "VQAv2_sample_validation_20"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
reshabhs/SPML_Chatbot_Prompt_Injection | ---
license: mit
task_categories:
- text-classification
language:
- en
tags:
- prompt-injection
- prompt-attack
- llm-safety
- llm-defense
- system-prompt
- malicious-user-prompt
pretty_name: SPML
size_categories:
- 10K<n<100K
---
# SPML Chatbot Prompt Injection Dataset
[Arxiv Paper](https://arxiv.org/abs/2402.11755)
Introducing the SPML Chatbot Prompt Injection Dataset: a robust collection of system prompts designed to create realistic chatbot interactions, coupled with a diverse array of annotated user prompts that attempt to carry out prompt injection attacks. While other datasets in this domain have centered on less practical chatbot scenarios or have limited themselves to "jailbreaking" – just one aspect of prompt injection – our dataset offers a more comprehensive approach. It not only features realistic chatbot definition and user prompts but also seamlessly integrates with existing prompt injection datasets.
Our primary focus is on the actual content of prompt injection payloads, as opposed to the methodologies used to execute the attacks. We are convinced that honing in on the detection of the payload content will yield a more robust defense strategy than one that merely identifies varied attack techniques.
## Dataset Description
| | Field | Description |
|----|-----------------|----------------------------------------------------------------------------------------------------------------------------------------------------------|
| 1 | System Prompt | These are the intended prompts for the chatbot, designed for use in realistic scenarios. |
| 2 | User Prompt | This field contains user inputs that query the chatbot with the system prompt described in (1). |
| 3 | Prompt Injection| This is set to 1 if the user input provided in (2) attempts to perform a prompt injection attack on the system prompt (1). |
| 4 | Degree | This measures the intensity of the injection attack, indicating the extent to which the user prompt violates the chatbot's expected operational parameters.|
| 5 | Source | This entry cites the origin of the attack technique used to craft the user prompt. |
## Dataset Generation Methodology
Our process begins with an initial set of system prompts derived from leaked system prompts from several widely-used chatbots powered by LLMs. We employ GPT-4 to extrapolate from these cases, crafting additional system prompts that emulate the style of the original seeds across diverse subject matters. These prompts are then used to create corresponding valid user input for each generated system prompt. To facilitate the creation of prompts for prompt injection attacks, we dissect each generated system prompt to identify a set of guiding principles or rules they aim to uphold, such as 'speak courteously'. GPT-4 is then tasked with producing an inverse list that semantically negates each rule; for instance, 'speak courteously' is countered with 'speak rudely'. From this inverse list, multiple rules are selected at random—the quantity of which dictates the complexity of the attack (degree)—and these are provided to GPT-4 alongside an 'attack seed prompt'. The objective is to craft a user prompt that aligns with the chosen contrarian rules but retains the stylistic nuances of the attack seed prompt. This tailored seed prompt may also integrate various other attack strategies, enhancing the sophistication and realism of the generated scenarios.
## FAQs
- Should I use this dataset to train my prompt injection detection model?
It is not advisable to train prompt injection detection models on this dataset. Typically, such models look for patterns in user prompts to detect prompt injections. However, the injection payloads in our dataset are subtle and may not be universally malicious. Training your model on the combinations of system and user prompts from our dataset will not ensure generalization until the model understands how the system prompt can be violated by the user prompt. These models require exposure to a wide range of attack techniques, and since our dataset only includes a limited selection applied to diverse payloads, it is not an ideal training source.
- Why were "jailbreak" datasets not included when jailbreaking is considered a form of prompt injection?
For the purpose of this dataset, we only considered sources like TensorTrust and Gandalf that provided precise system prompts. The jailbreak dataset is composed of user prompts designed to create LLM responses that breach ethical guidelines without accompanying system prompts. At the time of development, we lacked a clearly defined system prompt to encapsulate this, hence its exclusion.
- Why haven't attack prompts based on TensorTrust been released?
The TensorTrust dataset is not licensed for distribution, which precludes us from releasing attack prompts derived from it.
## Cite
```
@misc{sharma2024spml,
title={SPML: A DSL for Defending Language Models Against Prompt Attacks},
author={Reshabh K Sharma and Vinayak Gupta and Dan Grossman},
year={2024},
eprint={2402.11755},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
```
### Disclaimer
Please be aware that the dataset provided herein may contain information that could be potentially used for harmful purposes. By accessing and utilizing this data, you acknowledge and agree to bear sole responsibility for any such misuse. It is expected that all users will handle the dataset ethically. We, the providers of this data, expressly disclaim any liability for any improper or illicit use of the data and for any consequences that may arise as a result thereof.
By proceeding to use this dataset, you affirm your commitment to ethical conduct and responsible use of the data provided. |
Timbrt/MuLMS-Img | ---
license: cc-by-sa-4.0
task_categories:
- image-classification
- text-to-image
- object-detection
language:
- en
pretty_name: Multi Layer Materials Science Image Corpus
size_categories:
- 1K<n<10K
---
# Multi Layer Materials Science Image Corpus
This repository contains companion material for the following [publication](https://openaccess.thecvf.com/content/WACV2024/papers/Tarsi_SciOL_and_MuLMS-Img_Introducing_a_Large-Scale_Multimodal_Scientific_Dataset_and_WACV_2024_paper.pdf):
> Tim Tarsi, Heike Adel, Jan Hendrik Metzen, Dan Zhang, Matteo Finco, Annemarie Friedrich. **SciOL and MuLMS-Img: Introducing A Large-Scale Multimodal Scientific Dataset and Models for Image-Text Tasks in the Scientific Domain.** WACV 2024.
Please cite this paper if using the dataset, and direct any questions regarding the dataset
to [Tim Tarsi](mailto:tim.tarsi@gmail.com)
## Summary
The Multi-Layer Materials Science (MuLMS) corpus [1] is a dataset of 50 scientific publications in the materials science domain annotated for various natural language processing tasks. MuLMS-Img extends this dataset by providing over 14500 high quality, manual annotations for various image-text tasks, e.g., Figure type Classification, Optical Character Recognition (OCR) and Text Role Labeling and Figure Retrieval.
## Data Format
We provide the annotations of our dataset in the JSON format, split into a train, test and dev set. Images are provided as PNG files.
## Annotation Schema
Annotations are structured as in the following schema:
```
{
"$schema": "http://json-schema.org/draft-07/schema#",
"type": "object",
"properties": {
"task1": {
"name": "Chart Classification",
"output": {
"chart_type": {
"type": "string"
}
}
},
"task2": {
"name": "Text Detection and Recognition",
"output": {
"text_blocks": {
"type": "array"
}
}
},
"task3": {
"name": "Image Retrieval",
"output": {
"caption": {
"type": "string"
},
"queries": {
"type": "array"
}
}
}
}
}
```
## Proposed Tasks
In our paper, we introduce the following subtasks and provide human annotations to develop computational models.
**Figure Type Classification** constitutes a multi-class classification task of identifying the type of a figure, e.g., chart types such as bar plots, photographs or illustrations.
**Optical Character Recognition (OCR) and Role Labeling** requires bounding-box detection and transcription of the text within the bounding box, plus identifying the role of the content in the figure, e.g., ticks, legends, or axis labels.
**Figure Retrieval** is based on brief, *search-style* textual queries.
Our aim is to create real-world search queries that might be used in a retrieval system, where the style typically deviates from the descriptive and wordy nature of captions.
## Citation
If you use our dataset in your work, please cite our paper:
```
@InProceedings{Tarsi_2024_WACV,
author = {Tarsi, Tim and Adel, Heike and Metzen, Jan Hendrik and Zhang, Dan and Finco, Matteo and Friedrich, Annemarie},
title = {SciOL and MuLMS-Img: Introducing a Large-Scale Multimodal Scientific Dataset and Models for Image-Text Tasks in the Scientific Domain},
booktitle = {Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)},
month = {January},
year = {2024},
pages = {4560-4571}
}
```
## License
The MuLMS-Img corpus is released under the [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/) license.
## References
[1] Timo Pierre Schrader, Matteo Finco, Stefan Grünewald, Felix Hildebrand and Annemarie Friedrich. MuLMS: A Multi-Layer Annotated Text Corpus for Information Extraction in the Materials Science Domain. WIESP 2023. |
Multimodal-Fatima/Caltech101_with_background_train | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': accordion
'1': airplanes
'2': anchor
'3': ant
'4': background google
'5': barrel
'6': bass
'7': beaver
'8': binocular
'9': bonsai
'10': brain
'11': brontosaurus
'12': buddha
'13': butterfly
'14': camera
'15': cannon
'16': car side
'17': ceiling fan
'18': cellphone
'19': chair
'20': chandelier
'21': cougar body
'22': cougar face
'23': crab
'24': crayfish
'25': crocodile
'26': crocodile head
'27': cup
'28': dalmatian
'29': dollar bill
'30': dolphin
'31': dragonfly
'32': electric guitar
'33': elephant
'34': emu
'35': euphonium
'36': ewer
'37': faces
'38': faces easy
'39': ferry
'40': flamingo
'41': flamingo head
'42': garfield
'43': gerenuk
'44': gramophone
'45': grand piano
'46': hawksbill
'47': headphone
'48': hedgehog
'49': helicopter
'50': ibis
'51': inline skate
'52': joshua tree
'53': kangaroo
'54': ketch
'55': lamp
'56': laptop
'57': leopards
'58': llama
'59': lobster
'60': lotus
'61': mandolin
'62': mayfly
'63': menorah
'64': metronome
'65': minaret
'66': motorbikes
'67': nautilus
'68': octopus
'69': okapi
'70': pagoda
'71': panda
'72': pigeon
'73': pizza
'74': platypus
'75': pyramid
'76': revolver
'77': rhino
'78': rooster
'79': saxophone
'80': schooner
'81': scissors
'82': scorpion
'83': sea horse
'84': snoopy
'85': soccer ball
'86': stapler
'87': starfish
'88': stegosaurus
'89': stop sign
'90': strawberry
'91': sunflower
'92': tick
'93': trilobite
'94': umbrella
'95': watch
'96': water lilly
'97': wheelchair
'98': wild cat
'99': windsor chair
'100': wrench
'101': yin yang
- name: id
dtype: int64
- name: clip_tags_ViT_L_14
sequence: string
- name: LLM_Description_opt175b_downstream_tasks_ViT_L_14
sequence: string
- name: LLM_Description_gpt3_downstream_tasks_ViT_L_14
sequence: string
- name: blip_caption
dtype: string
- name: LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14
sequence: string
- name: Attributes_ViT_L_14_text_davinci_003_full
sequence: string
- name: Attributes_ViT_L_14_text_davinci_003_caltech101
sequence: string
- name: clip_tags_ViT_L_14_with_openai_classes
sequence: string
- name: clip_tags_ViT_L_14_wo_openai_classes
sequence: string
- name: clip_tags_ViT_L_14_simple_specific
dtype: string
- name: clip_tags_ViT_B_16_simple_specific
dtype: string
- name: clip_tags_ViT_B_16_ensemble_specific
dtype: string
- name: clip_tags_ViT_B_32_simple_specific
dtype: string
- name: clip_tags_ViT_B_32_ensemble_specific
dtype: string
- name: Attributes_ViT_B_16_descriptors_text_davinci_003_full
sequence: string
- name: Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full
sequence: string
- name: clip_tags_LAION_ViT_H_14_2B_simple_specific
dtype: string
- name: clip_tags_LAION_ViT_H_14_2B_ensemble_specific
dtype: string
splits:
- name: train
num_bytes: 49965015.0
num_examples: 3060
download_size: 45077220
dataset_size: 49965015.0
---
# Dataset Card for "Caltech101_with_background_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_sst2_correlative_constructions | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 2433
num_examples: 15
- name: test
num_bytes: 6242
num_examples: 35
- name: train
num_bytes: 42605
num_examples: 261
download_size: 32344
dataset_size: 51280
---
# Dataset Card for "MULTI_VALUE_sst2_correlative_constructions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LocalDoc/squad_azerbaijan | ---
language:
- az
license: cc-by-nc-2.0
size_categories:
- 100K<n<1M
task_categories:
- question-answering
pretty_name: SQuAD Azerbaijani Dataset
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answer_text
dtype: string
- name: answer_start
dtype: float64
- name: is_impossible
dtype: bool
splits:
- name: train
num_bytes: 131111739
num_examples: 130319
- name: test
num_bytes: 28387649
num_examples: 26247
download_size: 19290633
dataset_size: 159499388
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# SQuAD Azerbaijani Dataset
## Description
This dataset is the Azerbaijani version of the Stanford Question Answering Dataset (SQuAD), automatically translated from the original English dataset. SQuAD is a prominent dataset in natural language processing, used for machine comprehension and question-answering tasks. It consists of questions based on a set of Wikipedia articles, where the answer to each question is a segment of text from the corresponding article.
## Dataset Structure
### Data Fields
- `id`: a unique identifier for each question-answer pair.
- `title`: the title of the Wikipedia article from which the context is extracted.
- `context`: a segment of text from the Wikipedia article that contains the information necessary to answer the question.
- `question`: the question posed, translated into Azerbaijani.
- `answers`: a list containing:
- `text`: the segment of text that answers the question.
- `answer_start`: the position of the answer's first character in the context.
### Data Splits
The dataset is split into two subsets: `train` and `test`. The `train` subset is used for training models, while the `test` subset is for validating and testing them.
## Licensing Information
This work is licensed under a Creative Commons Attribution Non-Commercial 2.0 Generic License (CC BY-NC 2.0). This license allows others to remix, tweak, and build upon this work non-commercially, as long as they credit the creator and license their new creations under the identical terms.
## Citation
Please cite the following paper when using this dataset:
- Original SQuAD Paper Citation
## Acknowledgements
This dataset was created by [Valiyev Rashad], based on the Stanford Question Answering Dataset [https://rajpurkar.github.io/SQuAD-explorer/].
If you have any questions or suggestions, please contact us at [v.resad.89@gmail.com].
|
adsazad/gurmat-dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: "train.csv"
---
Training model for gurmatgpt |
LinaAlhuri/Arabic-COCO2014-Validation | ---
task_categories:
- image-to-text
language:
- ar
pretty_name: Arabic COCO 2014 Validation
size_categories:
- 100K<n<1M
---
# Arabic Translated COCO Validation Dataset
---
## Overview
Welcome to the Arabic Translated COCO Validation Dataset! This dataset is a version of the Common Objects in Context (COCO) dataset, specifically translated into Arabic. The COCO dataset is a widely used benchmark for image captioning and object detection tasks, and this translation aims to facilitate research and development in the Arabic language.
## Contents
1. **coco_url:** This column includes images URL which makes a subset of the COCO validation images.
2. **arabic_caption:** Arabic translations of the original COCO annotations, providing detailed information about image captions.
## Usage
- **Research and Development:** Use this dataset for training and evaluating models in the domain of image captioning and object detection with a focus on the Arabic language.
- **Benchmarking:** Evaluate the performance of your algorithms on this translated COCO dataset to contribute to the advancement of Arabic-language computer vision research.
## Dataset Translation and Bias
This dataset has been translated using the Google Translation API. It's important to note that automated translation methods, including machine translation, may introduce biases and inaccuracies. The translations are generated algorithmically and might not capture the full context or cultural nuances or might contain gender bias, leading to potential biases in the dataset. Researchers and users are advised to be mindful of these limitations and consider the implications of bias in their analyses.
|
musabg/wikipedia-tr | ---
annotations_creators:
- no-annotation
language:
- tr
language_creators:
- crowdsourced
license:
- cc-by-sa-3.0
- gfdl
multilinguality: []
pretty_name: Turkish Wikipedia 2023
size_categories:
- 100K<n<1M
source_datasets:
- original
tags:
- wikipedia, wiki,
task_categories:
- fill-mask
- text-generation
task_ids:
- masked-language-modeling
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 956353353
num_examples: 520542
download_size: 529875169
dataset_size: 956353353
---
# 📖 Türkçe Vikipedi Mayıs 2023
Bu veri kümesi, Türkçe Vikipedi'den alınan makalelerin bir derlemesi olup, maskeleme dil modelleme ve metin oluşturma görevleri için tasarlanmıştır.
## 🗣️ Etiketlemeler
Bu veri kümesindeki makaleler, özellikle belirli bir görev için etiketlenmemiş olup, veri kümesi etiketsizdir.
## 🌐 Dil
Bu veri kümesi Türkçe yazılmış olup, gönüllülerden oluşan bir ekip tarafından topluluk katılımı yöntemleri ile oluşturulmuştur.
## 📜 Lisans
CC-BY-SA 3.0 ve GFDL
## 💻 Kaynak Veri Kümeleri
Bu veri kümesi, Türkçe Vikipedi'den oluşturulan orijinal bir veri kümesidir.
Türkçe Vikipedi veri kümesini kullandığınız için teşekkürler! Dil modelleme ve metin oluşturma görevleriniz için faydalı olmasını umuyoruz.
---
# 📖 Wikipedia Turkish 2023
This dataset is a collection of articles from the Turkish Wikipedia and is designed to be used for masked language modeling and text generation tasks.
## 📚 Dataset Info
Processed and cleaned using Huggingface wikipedia cleaner.
## 🗣️ Annotations
The articles in this dataset were not specifically annotated for any particular task, meaning that the dataset is unlabeled.
## 🌐 Language
This dataset is written in Turkish and was created using crowdsourcing methods by a team of volunteers.
## 📜 License
CC-BY-SA 3.0 and GFDL
## 💻 Source Datasets
This dataset is an original dataset created from the Turkish Wikipedia.
|
DatasetingBR/Juh | ---
license: openrail
---
|
Ramitha/spanish-legal-data-lite | ---
dataset_info:
features:
- name: Data
dtype: string
splits:
- name: train
num_bytes: 122971
num_examples: 501
download_size: 62737
dataset_size: 122971
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "spanish-legal-data-lite"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
phil20/indian_food_images | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': burger
'1': butter_naan
'2': chai
'3': chapati
'4': chole_bhature
'5': dal_makhani
'6': dhokla
'7': fried_rice
'8': idli
'9': jalebi
'10': kaathi_rolls
'11': kadai_paneer
'12': kulfi
'13': masala_dosa
'14': momos
'15': paani_puri
'16': pakode
'17': pav_bhaji
'18': pizza
'19': samosa
splits:
- name: train
num_bytes: 1211041119.0714333
num_examples: 5328
- name: test
num_bytes: 238879486.3925666
num_examples: 941
download_size: 1600841122
dataset_size: 1449920605.464
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
piercus/unsplash-lite-palette | ---
license: other
license_name: unsplash-commercial
license_link: https://github.com/unsplash/datasets/blob/master/DOCS.md
---
|
sl-alex/openai-prm800k-stepwise-best | ---
license: mit
---
Denormalized dataset created by processing OpenAI's [PRM800K](https://github.com/openai/prm800k/tree/main) process supervision dataset via [prm800k-denorm](https://github.com/scottlogic-alex/prm800k-denorm).
Consists of samples of "what's been said so far" + "what's the next step in the conversation".
Filtered to just conversation turns which progressed the solution. Where multiple constructive responses were available: we pick only the best (as rated by the human evaluator).
Dataset description and usage instructions in [prm800k-denorm README](https://github.com/scottlogic-alex/prm800k-denorm/blob/main/README.md). |
eminorhan/llm-memory | ---
license: mit
---
This repository contains the results of all experiments (inlcuding every single hyperparameter run) reported in the following paper:
Orhan AE (2023) [Recognition, recall, and retention of few-shot memories in large language models.](https://arxiv.org/abs/2303.17557) arXiv:2303.17557.
A brief description of the directories included in this repository:
* [`evals`](https://huggingface.co/datasets/eminorhan/llm-memory/tree/main/evals): contains the results of all recognition experiments
* [`recalls`](https://huggingface.co/datasets/eminorhan/llm-memory/tree/main/recalls): contains the results of all recall experiments
* [`re-evals`](https://huggingface.co/datasets/eminorhan/llm-memory/tree/main/re-evals): contains the results of all recognition experiments during the retention phase
* [`re-recalls`](https://huggingface.co/datasets/eminorhan/llm-memory/tree/main/re-recalls): contains the results of all recall experiments during the retention phase
* [`scratch-evals`](https://huggingface.co/datasets/eminorhan/llm-memory/tree/main/scratch-evals), [`scratch-recalls`](https://huggingface.co/datasets/eminorhan/llm-memory/tree/main/scratch-recalls), [`scratch-re-evals`](https://huggingface.co/datasets/eminorhan/llm-memory/tree/main/scratch-re-evals), [`scratch-re-recalls`](https://huggingface.co/datasets/eminorhan/llm-memory/tree/main/scratch-re-recalls): similar to the above, but the results are for the `gpt-j-6B-st` model trained from scratch on [`wikitext-103-raw-v1`](https://huggingface.co/datasets/wikitext). |
preference-agents/enron-personalization-sample-with-metrics | ---
dataset_info:
features:
- name: id
dtype: string
- name: message_id
dtype: string
- name: from
sequence: string
- name: to
sequence: string
- name: date
dtype: string
- name: subject
dtype: string
- name: content
dtype: string
- name: email_context
dtype: string
- name: token_count_content
dtype: int32
- name: token_count_context
dtype: int32
- name: content_extracted
struct:
- name: databricks-dbrx-instruct
dtype: string
- name: databricks-llama-2-70b-chat
dtype: string
- name: databricks-mixtral-8x7b-instruct
dtype: string
- name: baseline_generated_emails
struct:
- name: databricks-dbrx-instruct
struct:
- name: databricks-dbrx-instruct
dtype: string
- name: databricks-llama-2-70b-chat
dtype: string
- name: databricks-mixtral-8x7b-instruct
dtype: string
- name: databricks-llama-2-70b-chat
struct:
- name: databricks-dbrx-instruct
dtype: string
- name: databricks-llama-2-70b-chat
dtype: string
- name: databricks-mixtral-8x7b-instruct
dtype: string
- name: databricks-mixtral-8x7b-instruct
struct:
- name: databricks-dbrx-instruct
dtype: string
- name: databricks-llama-2-70b-chat
dtype: string
- name: databricks-mixtral-8x7b-instruct
dtype: string
- name: automatic_eval
struct:
- name: databricks-dbrx-instruct
struct:
- name: databricks-dbrx-instruct
dtype: string
- name: databricks-llama-2-70b-chat
dtype: string
- name: databricks-mixtral-8x7b-instruct
dtype: string
- name: databricks-llama-2-70b-chat
struct:
- name: databricks-dbrx-instruct
dtype: string
- name: databricks-llama-2-70b-chat
dtype: string
- name: databricks-mixtral-8x7b-instruct
dtype: string
- name: databricks-mixtral-8x7b-instruct
struct:
- name: databricks-dbrx-instruct
dtype: string
- name: databricks-llama-2-70b-chat
dtype: string
- name: databricks-mixtral-8x7b-instruct
dtype: string
splits:
- name: train
num_bytes: 1609987
num_examples: 129
download_size: 889785
dataset_size: 1609987
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mmoebis/5gdata_1_test | ---
dataset_info:
features:
- name: Sentences
dtype: string
- name: Questions
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 56304
num_examples: 199
download_size: 6633
dataset_size: 56304
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yinxiang/test | ---
license: zlib
---
|
parsak/alpaca-tr-9k-longest | ---
language:
- tr
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 8146532
num_examples: 9000
download_size: 4823375
dataset_size: 8146532
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Minglii/e10 | ---
dataset_info:
features:
- name: data
struct:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 3496846
num_examples: 5200
download_size: 2006397
dataset_size: 3496846
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "e10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nacielo/musiccap_4 | ---
dataset_info:
features:
- name: ytid
dtype: string
- name: start_s
dtype: int64
- name: end_s
dtype: int64
- name: audioset_positive_labels
dtype: string
- name: aspect_list
dtype: string
- name: caption
dtype: string
- name: author_id
dtype: int64
- name: is_balanced_subset
dtype: bool
- name: is_audioset_eval
dtype: bool
- name: download_status
dtype: bool
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 3475708168.0
num_examples: 1000
download_size: 3416657555
dataset_size: 3475708168.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "musiccap_4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
4eJIoBek/PAIT-Downloads | ---
license: unknown
---
This is a downloads of https://gz1k.itch.io/ai-portable-tools, but on huggingface for lightning speed of downloading. I hope i haven't broke ToS of Huggingface Hub by uploading these tools here.
----------------------------------
This is my collection of portable AI packages to run it fast without anxious headache in console. initially, I made these tools for myself, but maybe someone else will need them. OK, heres the list:
-TEXT-
Koboldai [CPU/CUDA] - link - also in downloads / online demo
-CHAT-
Llama 2 chat 7B 4bit koboldcpp webui [CPU] - in downloads
/ source / webui / model / online demo
Llama 2 chat 7B Luna ai uncensored 4bit koboldcpp webui (note that this is a finetune on unsupervised synthetic dataset, so it hallucinates way more strong than original llama-2-chat) [CPU] - in downloads / source / webui / model /
Vicuna 1.1 7B 4bit koboldcpp webui (much worse than llama2 above, but may be more multilingual) [CPU] - in downloads. / source / webui / model / online demo
-TRANSLATE-
Facebook NLLB 600m webui [CPU] - in downloads / source / webui / model / online demo
-MIDI MUSIC GENERATION-
Midi composer app [CUDA][CPU] - link - also in downloads / source / online demo
Multitrack midi music generator (generates short jingles, each instrument generated separately) [CPU] - in downloads / webui
-TEXT TO MUSIC/AUDIO-
AudioCraft Plus [CUDA/CPU] - in downloads / source / webui / online demo
-TEXT TO SPEECH-
Suno ai Bark webui (with zeroshot voice conversion) [CUDA/CPU] - in downloads / source / webui / online demo
Coqui XTTS webui (this one generates speech only with voice cloning) (voice cloning is more "stable" than bark, but the accent and emotions can be lost) [CUDA] - in downloads / source / webui
TorToiSe webui [CUDA/CPU] - in downloads / source / webui / online demo
-VOICE CONVERSION VIA TRAINING-
RVC singing voice cloning webui [CUDA] - link - also in downloads / source
-VOICE ZEROSHOT CONVERSION-
FreeVC webui [CPU] - in downloads / source / webui
-VOICE TO TEXT-
Whispercpp GUI [DirectX/CPU] - link - also in downloads / source / gui / online demo
-VOCALS RESTORATION-
VoiceFixer webui [CPU] - in downloads / source / webui
-DUAL SPEAKER SPEECH SEPARATION-
Dual Path RNN (cli interface) - in downloads / source
-VOCALS/STEMS EXTRACTION-
UVR [CPU/CUDA] - link - also in downloads / online demo
Demucs GUI [CPU][CUDA] - link - also in downloads / source / gui
-IMAGE COLORIZATION-
DeOldify .NET gui [CPU] - link - also in downloads / source / gui / online demo
-ZEROSHOT IMAGE MATTING-
DIS webui [CPU] - in downloads / source / webui
-IMAGE UPSCALING-
Cupscale [Vulkan/CUDA] - link - also in downloads / source / webui / online demo
Automatic1111 sdwebui with StableSR extension [CUDA/CPU] - in downloads / source / webui / extension
-TEXT2IMAGE-
Automatic1111 Stable Diffusion base (without models) - link / webui
Automatic1111 deliberate v2 (sd1.5) model [CUDA/CPU][DIRECTX/CPU] - in downloads / source / webui / directx webui / model
Automatic1111 Illuminati Diffusion (sd2.1) model [CUDA/CPU] - in downloads / source / webui / model
Fooocus (sdxl) [CUDA] - link- also in downloads / source / webui / model / refiner
ConfyUI (without models) [CUDA/CPU] - link - also in downloads / source / webui
-IMAGE EDITING BY PROMPT-
Automatic1111 Instructpix2pix (sd1.5) model [DIRECTX/CPU][CUDA/CPU] - in downloads / source / ip2p source / webui / directx webui / model
-IMAGE TO IMAGE VARIATIONS-
Automatic1111 sd-unclip (sd2.1) model [CUDA/CPU] - in downloads / source / webui / model
-IMAGE EDITING BY CONCEPTS-
LEDITS webui [CUDA/CPU] - in downloads / source / webui
-OBJECT REMOVING-
lama cleaner [CUDA] - in downloads / source / webui / online demo
-VIDEO FRAMES INTERPOLATION-
Flowframes [CUDA/Vulkan] - in downloads / source / gui
-VIDEO UPSCALING-
RealBasicVSR (cli interface) [CUDA/CPU] - in downloads / source
-TEXT2VIDEO-
Automatic1111 sdwebui with animatediff extension [CUDA/CPU] - in downloads / source / webui / extension / model / online demo
Automatic1111 sdwebui with modelscope text2video extension with zeroscope-v2-576w model [CUDA] - in downloads / source / webui / extension / model / online demo
-VIDEO HUMAN MATTING-
RobustVideoMatting (cli interface) [CUDA/CPU] - in downloads / source / online demo
-VIDEO ZERO-SHOT MATTING-
Track-anything webui [CPU] - in downloads / webui / online demo
-VIDEO FEW-SHOT MATTING VIA TRAINING-
DeepXTools by Iperov [CUDA] - link - also in downloads
-ZERO-SHOT DEEPFAKING-
Roop neurogen mod (Refacer model) (lightning fast, has realtime deepfake on webcam function) (the refacer model swaps faces better than simswap, but have only 128px resolution and may have more artifacts when head is on side) [DirectX/CUDA/CPU] - in downloads / source / webui / mod by
Deepinsight Refacer gradio webui (replaces only certain faces, has cool face upscale feature) [CUDA] - in downloads / source / webui / mod by
Simswap (cli interface) [CUDA/CPU] - in downloads / source
-DEEPFAKING VIA TRAINING-
DeepFaceLab (cli interface) [DirectX][CUDA] - link - also in downloads / source
DeepfaceLive [DirectX][CUDA] - link - also in downloads / source
-LIPS MANIPULATION ON VIDEO-
wav2lip gui [CUDA/CPU] - link - also in downloads / source / gui
-TEXT To 3D-
Shap-E webui [CUDA/CPU] -in downloads / source / webui
Point-E webui [CUDA/CPU] (results are worse than shap-e) - in downloads / source / webui
-NEURAL RADIANCE FIELDS GENERATION BY IMAGES-
nerfstudio (nerfacto) [CUDA] - in downloads / source
--------------------------------------------------------------
Alternative downloads with torrents on Archive.org: https://archive.org/details/@takeonme1?tab=uploads
Page on civitai: https://civitai.com/models/104609 |
dkshjn/processed_truthy | ---
dataset_info:
features:
- name: id
dtype: string
- name: source
dtype: string
- name: system
dtype: string
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: formatted_chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: formatted_rejected
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 3097676
num_examples: 1016
download_size: 1360242
dataset_size: 3097676
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "processed_truthy"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-one-sec-cv12-each-chunk-uniq/chunk_256 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 894425488.0
num_examples: 174284
download_size: 911937731
dataset_size: 894425488.0
---
# Dataset Card for "chunk_256"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rishabhjain16/owr_cv_albanian_test | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: transcription
dtype: string
splits:
- name: test
num_bytes: 57492146.0
num_examples: 384
download_size: 47245992
dataset_size: 57492146.0
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
lhallee/EC_fold | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
dataset_info:
features:
- name: seqs
dtype: string
- name: labels
dtype: string
splits:
- name: train
num_bytes: 30167609
num_examples: 13089
- name: valid
num_bytes: 3394049
num_examples: 1465
- name: test
num_bytes: 3655560
num_examples: 1604
download_size: 9383528
dataset_size: 37217218
---
# Dataset Card for "EC_fold"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Joshua-Abok/tiny-Open-platypus | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 12637859
num_examples: 5000
download_size: 6751148
dataset_size: 12637859
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
FinancialSupport/ScrapedJobs | ---
dataset_info:
features:
- name: job_url
dtype: string
- name: site
dtype: string
- name: title
dtype: string
- name: company
dtype: string
- name: company_url
dtype: string
- name: location
dtype: string
- name: job_type
dtype: string
- name: date_posted
dtype: date32
- name: interval
dtype: string
- name: min_amount
dtype: int64
- name: max_amount
dtype: int64
- name: currency
dtype: string
- name: is_remote
dtype: bool
- name: num_urgent_words
dtype: int64
- name: benefits
dtype: string
- name: emails
dtype: string
- name: description
dtype: string
splits:
- name: train
num_bytes: 1822669
num_examples: 408
download_size: 382933
dataset_size: 1822669
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mnoukhov/openai_summarize_comparisons_tldrprompt_relabel1b | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 156593160
num_examples: 92534
- name: test
num_bytes: 8322345
num_examples: 5000
download_size: 21793816
dataset_size: 164915505
---
# Dataset Card for "openai_summarize_comparisons_tldrprompt_relabel1b"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ovior/twitter_dataset_1713192119 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2808439
num_examples: 8273
download_size: 1624519
dataset_size: 2808439
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
aegrif/CIS6930_DAAGR_Empathetic_Dialogues | ---
dataset_info:
features:
- name: conv_id
dtype: string
- name: utterance_idx
dtype: int64
- name: context
dtype: string
- name: prompt
dtype: string
- name: utterance
dtype: string
- name: new_context
dtype: string
- name: previous_utterance
dtype: string
splits:
- name: train
num_bytes: 23146751
num_examples: 84167
- name: validation
num_bytes: 3522545
num_examples: 12077
- name: test
num_bytes: 3490587
num_examples: 10973
download_size: 11165291
dataset_size: 30159883
---
# Dataset Card for "CIS6930_DAAGR_Empathetic_Dialogues"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
WasuratS/ECMWF_Thailand_Land_Air_Temperatures | ---
license: eupl-1.1
task_categories:
- time-series-forecasting
tags:
- climate
size_categories:
- 100M<n<1B
---
# Dataset Summary
Contains hourly 2 meters of land (on-shore) air temperature data within grid areas of Thailand country. <br/>
Data is retrieved from [Corpernicus Climate Data Store](https://cds.climate.copernicus.eu/cdsapp#!/home) on [ERA5-Land hourly data from 1950 to present](https://cds.climate.copernicus.eu/cdsapp#!/dataset/10.24381/cds.e2161bac?tab=overview)
<br/>
Thailand areas in this context is **Latitude** = **[5.77434, 20.43353]** and **Longitude** = **[97.96852, 105.22908]** <br/>
For more details of data, you can refer to [ERA5-Land hourly data from 1950 to present](https://cds.climate.copernicus.eu/cdsapp#!/dataset/reanalysis-era5-land?tab=overview)
- Data Granularity: Hourly per Latitude/ Longitude
- Period: **31/Dec/1999** - **08/May/2023**
- Temperature Unit: Celsius (°C) (Original data from [ERA5-Land hourly data from 1950 to present](https://cds.climate.copernicus.eu/cdsapp#!/dataset/10.24381/cds.e2161bac?tab=overview) is Kelvin)
# Source Data
- Organization of the producer: ECMWF
# Data Creation
Below is an example of how to make data query using Python via [CDS API](https://cds.climate.copernicus.eu/api-how-to) in monthly requests. <br/>
Script can be found [here](https://huggingface.co/datasets/WasuratS/ECMWF_Thailand_Land_Air_Temperatures/blob/main/cds_api_requestor_example.py)
``` python
import cdsapi
c = cdsapi.Client()
month_list = [str(num).zfill(2) for num in range(1, 13)]
day_list = [str(num).zfill(2) for num in range(1, 32)]
time_list = [str(num).zfill(2) + ":00" for num in range(0, 24)]
year_list = [str(num) for num in range(2000, 2022)]
for year in year_list:
for month in month_list:
c.retrieve('reanalysis-era5-land',
{
'variable': [
'2m_temperature']
,
'year': year,
'month' : month,
'day': day_list,
'time': time_list,
'format': 'grib',
'area': [
20.43, 97.96, 5.77,
105.22,
],
},
f'{year}_{month}_hourly_2m_temp_TH.grib')
```
Direct file output from API is in ```.grib``` format, to make it easy for further analysis work, I have converted it to ```.parquet``` format. <br/>
To convert GRIB format to pandas dataframe, you can use [xrray](https://github.com/pydata/xarray) and [cfgrib](https://github.com/ecmwf/cfgrib) library to help as below example snippet of code.
``` python
import xarray as xr
import cfgrib
ds = xr.open_dataset('2022_12_31_hourly_2m_temp_TH.grib', engine='cfgrib')
df = ds.to_dataframe().reset_index()
```
## Licensing
[Climate Data Store Product Licensing](https://cds.climate.copernicus.eu/api/v2/terms/static/licence-to-use-copernicus-products.pdf)
## Citation
- This data was generated using **Copernicus Climate Change Service** information and <br/>
contains modified **Copernicus Climate Change Service** information on 1999/Dec/31 - 2023/May/08 data period
- Muñoz Sabater, J. (2019): ERA5-Land hourly data from 1950 to present. <br/>
Copernicus Climate Change Service (C3S) Climate Data Store (CDS). <br/>
DOI: [10.24381/cds.e2161bac](https://cds.climate.copernicus.eu/cdsapp#!/dataset/10.24381/cds.e2161bac?tab=overview) (Accessed on 13-May-2023)
- Copernicus Climate Change Service (C3S) (2022): ERA5-Land hourly data from 1950 to present. <br/>
Copernicus Climate Change Service (C3S) Climate Data Store (CDS). <br/>
DOI: [10.24381/cds.e2161bac](https://cds.climate.copernicus.eu/cdsapp#!/dataset/10.24381/cds.e2161bac?tab=overview) (Accessed on 13-May-2023) |
Isaak-Carter/MAIN_JOSIE_wizard_vicuna_70k_unfiltered_de | ---
dataset_info:
features:
- name: sample
dtype: string
splits:
- name: train
num_bytes: 162205855
num_examples: 34598
download_size: 79618801
dataset_size: 162205855
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
```text
\n<|gökdeniz|>{input}<|endoftext|>\n<|josie|>{respond}<|endoftext|>
```
```text
\n<|gökdeniz|>Wählen Sie alle geraden Zahlen aus der Liste aus.\n17, 8, 3, 22, 9<|endoftext|>\n<|josie|>
``` |
CyberHarem/brigid_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of brigid (Fire Emblem)
This is the dataset of brigid (Fire Emblem), containing 105 images and their tags.
The core tags of this character are `blonde_hair, long_hair, headband, breasts, yellow_eyes, large_breasts, brown_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 105 | 135.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/brigid_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 105 | 72.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/brigid_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 235 | 155.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/brigid_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 105 | 114.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/brigid_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 235 | 220.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/brigid_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/brigid_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, cleavage, gloves, simple_background, solo, belt, black_headband, looking_at_viewer, wavy_hair, white_background, blush, closed_mouth, dress, choker, medium_breasts, open_mouth, smile, weapon |
| 1 | 23 |  |  |  |  |  | 1girl, solo, arrow_(projectile), bow_(weapon), dress, belt, holding_weapon, fingerless_gloves, armor, smile, thighhighs, very_long_hair, looking_at_viewer, low-tied_long_hair, elbow_gloves, quiver |
| 2 | 5 |  |  |  |  |  | 1girl, choker, cleavage, solo, belt, black_gloves, black_headband, black_thighhighs, thighs, wavy_hair, blush, dress, elbow_gloves, looking_at_viewer, simple_background, white_background, closed_mouth, grin |
| 3 | 5 |  |  |  |  |  | cleavage, navel, 1girl, solo, bare_shoulders, collarbone, looking_at_viewer, simple_background, white_background, alternate_costume, ass_visible_through_thighs, black_bikini, cowboy_shot, fingerless_gloves, low-tied_long_hair, stomach |
| 4 | 13 |  |  |  |  |  | 1girl, nipples, hetero, solo_focus, sex, sweat, vaginal, blush, open_mouth, penis, 1boy, medium_breasts, pubic_hair, pussy, breasts_out, completely_nude, mosaic_censoring, thighhighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | gloves | simple_background | solo | belt | black_headband | looking_at_viewer | wavy_hair | white_background | blush | closed_mouth | dress | choker | medium_breasts | open_mouth | smile | weapon | arrow_(projectile) | bow_(weapon) | holding_weapon | fingerless_gloves | armor | thighhighs | very_long_hair | low-tied_long_hair | elbow_gloves | quiver | black_gloves | black_thighhighs | thighs | grin | navel | bare_shoulders | collarbone | alternate_costume | ass_visible_through_thighs | black_bikini | cowboy_shot | stomach | nipples | hetero | solo_focus | sex | sweat | vaginal | penis | 1boy | pubic_hair | pussy | breasts_out | completely_nude | mosaic_censoring |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:---------|:--------------------|:-------|:-------|:-----------------|:--------------------|:------------|:-------------------|:--------|:---------------|:--------|:---------|:-----------------|:-------------|:--------|:---------|:---------------------|:---------------|:-----------------|:--------------------|:--------|:-------------|:-----------------|:---------------------|:---------------|:---------|:---------------|:-------------------|:---------|:-------|:--------|:-----------------|:-------------|:--------------------|:-----------------------------|:---------------|:--------------|:----------|:----------|:---------|:-------------|:------|:--------|:----------|:--------|:-------|:-------------|:--------|:--------------|:------------------|:-------------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 23 |  |  |  |  |  | X | | | | X | X | | X | | | | | X | | | | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | | X | X | | | X | | X | | | | | | | | | | | | X | | | | X | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 4 | 13 |  |  |  |  |  | X | | | | | | | | | | X | | | | X | X | | | | | | | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/093da9e6 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 180
num_examples: 10
download_size: 1330
dataset_size: 180
---
# Dataset Card for "093da9e6"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mtzeve/neuro_patents_bds | ---
dataset_info:
features:
- name: appln_id
dtype: int64
- name: appln_filing_date
dtype: string
- name: docdb_family_id
dtype: int64
- name: granted
dtype: string
- name: appln_abstract
dtype: string
- name: appln_abstract_lg
dtype: string
- name: appln_title
dtype: string
- name: applt_coun
dtype: string
- name: invt_coun
dtype: string
- name: cpc
dtype: string
- name: ipc
sequence: string
- name: __index_level_0__
dtype: int64
- name: input
dtype: string
- name: completion
dtype: string
splits:
- name: train
num_bytes: 13225.2
num_examples: 6
download_size: 28120
dataset_size: 13225.2
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-eval-xsum-default-1c6815-27497144913 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- xsum
eval_info:
task: summarization
model: pszemraj/led-large-book-summary
metrics: []
dataset_name: xsum
dataset_config: default
dataset_split: test
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: pszemraj/led-large-book-summary
* Dataset: xsum
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@kaprerna135](https://huggingface.co/kaprerna135) for evaluating this model. |
tarteel-ai/tlog | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: label
dtype:
class_label:
names:
'0': media
'1': recordings
'2': unidentified
splits:
- name: train
num_bytes: 370981663502.674
num_examples: 719853
download_size: 552670139289
dataset_size: 370981663502.674
---
# Dataset Card for "old"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Bluebomber182/Wilbur-Robinson | ---
license: mit
---
|
OpenDFM/MULTI-Benchmark | ---
license: mit
language:
- zh
pretty_name: MULTI-Benchmark
viewer: False
---
# 🖼️ MULTI-Benchmark: Multimodal Understanding Leaderboard with Text and Images
<div align="center">

🌐 [Website](https://OpenDFM.github.io/MULTI-Benchmark/) | 📃 [Paper](https://arxiv.org/abs/2402.03173/) | 🤗 [Dataset](https://huggingface.co/datasets/OpenDFM/MULTI-Benchmark) | 📮 [Submit](https://opendfm.github.io/MULTI-Benchmark/static/pages/submit.html)
[简体中文](./README_zh.md) | English
</div>
## 🔥 News
- **[2024.3.4]** We have released the [evaluation page](https://OpenDFM.github.io/MULTI-Benchmark/static/pages/submit.html).
- **[2024.2.19]** We have released the [HuggingFace Page](https://huggingface.co/datasets/OpenDFM/MULTI-Benchmark/).
- **[2024.2.6]** We have published our [paper](https://arxiv.org/abs/2402.03173/) on arXiv.
- **[2023.12.7]** We have released the [code](https://github.com/OpenDFM/MULTI-Benchmark/tree/main/eval) of our benchmark evaluation.
- **[2023.12.5]** We have released the [GitHub Page](https://OpenDFM.github.io/MULTI-Benchmark/).
## 📖 Overview
Rapid progress in multimodal large language models (MLLMs) highlights the need to introduce challenging yet realistic benchmarks to the academic community, while existing benchmarks primarily focus on understanding simple natural images and short context. In this paper, we present***MULTI***, as a cutting-edge benchmark for evaluating MLLMs on understanding complex tables and images, and reasoning with long context. **MULTI** provides multimodal inputs and requires responses that are either precise or open-ended, reflecting real-life examination styles. **MULTI** includes over 18,000 questions and challenges MLLMs with a variety of tasks, ranging from formula derivation to image detail analysis and cross-modality reasoning. We also introduce***MULTI-Elite***, a 500-question selected hard subset, and ***MULTI-Extend***, with more than 4,500 external knowledge context pieces. Our evaluation indicates significant potential for MLLM advancement, with GPT-4V achieving a **63.7%** accuracy rate on **MULTI**, in contrast to other MLLMs scoring between **28.5%** and **55.3%**. **MULTI** serves not only as a robust evaluation platform but also paves the way for the development of expert-level AI.
## 🏆 Leaderboard
| Modality | Model | Version | Overall | MULTI-Elite |
|:--------:|:-------------:| -------------------------- |:-------:|:-----------:|
| 🖼️ | GPT-4V | gpt-4-vision-preview | 63.7 | 14.0 |
| 🖼️ | Yi-VL | Yi-34B-Chat | 55.3 | 26.2 |
| 🖼️ | Gemini Vision | gemini-pro-vision | 53.7 | 12.4 |
| 📃 | Gemini | gemini-pro | 52.2 | 10.5 |
| 📃 | GPT-4 | gpt-4-1106-preview | 50.2 | 5.8 |
| 📃 | DFM-2.0 | dfm-2.0-70b-preview | 49.7 | 18.0 |
| 🖼️ | InternVL | InternVL-Chat-Chinese-V1.1 | 44.9 | 20.7 |
| 🖼️ | Qwen-VL | Qwen-VL-Chat | 39.0 | 10.5 |
| 📃 | ChatGPT | gpt-3.5-turbo-1106 | 35.9 | 4.7 |
| 🖼️ | VisCPM | VisCPM-Chat | 33.4 | 13.0 |
| 📃 | MOSS | moss-moon-003-sft | 32.6 | 13.1 |
| 🖼️ | VisualGLM | visualglm-6b | 31.1 | 12.8 |
| 🖼️ | Chinese-LLaVA | Chinese-LLaVA-Cllama2 | 28.5 | 12.3 |
## ⏬ Download
You can simply download data using the following command:
```shell
cd eval
python download_data.py
```
The structure of `./data` should be something like:
```
./data
├── images # folder containing images
├── problem_v1.2.2_20240212_release.json # MULTI
├── knowledge_v1.2.2_20240212_release.json # MULTI-Extend
├── hard_list_v1.2.1_20240206.json # MULTI-Elite
└── captions_v1.2.0_20231217.csv # image captions generated by BLIP-6.7b
```
## 📝 How to Evaluate
We provide a unified evaluation framework in `eval`. Each file in `eval/models` contains an evaluator specified to one M/LLM, and implements a `generate_answer` method to receive a question as input and give out the answer of it.
```shell
cd eval
python eval.py -h # to list all supported arguments
python eval.py -l # to list all supported models
```
### Environment Preparation Before Usage
Each evaluator requires its unique environment setting, and a universal environment may not work for all evaluators. **Just follow the official guide.** If the corresponding model runs well, then so should it fit in our framework.
You just need to install another two packages to run the evaluation code:
```shell
pip install tiktoken tqdm
```
If you just want to generate data for a specific setting (using `--debug` argument), this line above is all you need.
### Running Evaluation
For a quick start, see these examples:
Test GPT-4V model on whole MULTI with multimodal input, using MULTI-Extend as external knowledge:
```shell
python eval.py \
--problem_file ../data/problem_v1.2.2_20240212_release.json \
--knowledge_file ../data/knowledge_v1.2.2_20240212_release.json \
--questions_type 0,1,2,3 \
--image_type 0,1,2 \
--input_type 2 \
--model gpt-4v \
--model_version gpt-4-vision-preview \
--api_key sk-************************************************
```
Test Qwen-VL model on MULTI-Elite with image caption input, skip all questions not containing images, evaluate only multiple-choice questions, automatically set cuda device:
```shell
python eval.py \
--problem_file ../data/problem_v1.2.2_20240212_release.json \
--subset ../data/hard_list_v1.2.1_20240206.json \
--caption_file ../data/captions_v1.2.0_20231217.csv \
--questions_type 0,1 \
--image_type 1,2 \
--input_type 1 \
--model qwen-vl \
--model_dir ../models/Qwen-VL-Chat
```
The evaluation script will generate a folder named `results` under the root directory, and the result will be saved in `../results/EXPERIMENT_NAME`. During the evaluation, the script will save checkpoints in `../results/EXPERIMENT_NAME/checkpoints`, you can delete them after the evaluation is done. If the evaluation is interrupted, you can continue from the last checkpoint:
```shell
python eval.py \
--checkpoint_dir ../results/EXPERIMENT_NAME
```
Most of the arguments are saved in `../results/EXPERIMENT_NAME/args.json`, so you can continue the evaluation without specifying all the arguments again. Please note that `--api_key` is not saved in `args.json` for security reasons, so you need to specify it again.
```shell
python eval.py \
--checkpoint_dir ../results/EXPERIMENT_NAME \
--api_key sk-************************************************
```
For more details of arguments, please use `python eval.py -h`, and refer to `args.py` and `eval.py`.
### Add Support for Your Models
It's recommended to read the code of the other given evaluators in `eval/models` before your implementation.
Create `class YourModelEvaluator` and implement `generate_answer(self, question:dict)` to match the design supported in `eval.py` and `eval.sh`, which is anticipated to largely ease the coding process.
**Do not forget to add their references into `args.py` for the convenience of usage.**
You can execute `model_tester.py` in the `eval` folder to check the correctness of you implementation. Various problems including implementation errors, small bugs in code, and even wrong environment settings may cause failure of the evaluation. The examples provided in the file cover most kinds of cases presented in our benchmark. Feel free to change the code in it to debug your code😊
```shell
python model_tester.py <args> # args are similar to the default settings above
```
### Create Captions and OCR Data for Images
Generate captions or OCR data for images, and save them in csv with format below:
```
../data/images/czls/502_1.png,a cartoon drawing of a man standing in front of a large block
../data/images/czls/525_1.png,a chinese newspaper with the headline, china's new year
...
```
We provide two example scripts to generate captions (`image_caption.py`) and OCR data (`image_ocr.py`) for images.
## 📮 How to Submit
You need to first prepare a UTF-8 encoded JSON file with the following format:
```
{
"czsx_0_0": {
"question_id": "czsx_0_0",
"question_image_number": 1,
"image_list": [...], # optional
"input_message": ..., # optional
"prediction": "C"
},
...
}
```
If you evaluate the model with our official code, you can simply zip the prediction file `prediction.json` and the configuration file `args.json` in the experiment results folder `. /results/EXPERIMENT_NAME` in `.zip` format.
Then, you can submit your result to our [evaluation page](https://opendfm.github.io/MULTI-Benchmark/static/pages/submit.html).
You are also welcomed to pull a request and contribute your code to our evaluation code. We will be very grateful for your contribution!
**[Notice]** Thank you for being so interested in the **MULTI** dataset! If you want to add your model in our leaderboard, please fill in [this questionnaire](https://wj.sjtu.edu.cn/q/89UmRAJn), your information will be kept strictly confidential, so please feel free to fill it out. 🤗
## 📑 Citation
If you find our work useful, please cite us!
```
@misc{zhu2024multi,
title={{MULTI}: Multimodal Understanding Leaderboard with Text and Images},
author={Zichen Zhu and Yang Xu and Lu Chen and Jingkai Yang and Yichuan Ma and Yiming Sun and Hailin Wen and Jiaqi Liu and Jinyu Cai and Yingzi Ma and Situo Zhang and Zihan Zhao and Liangtai Sun and Kai Yu},
year={2024},
eprint={2402.03173},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
## 📧 Contact Us
If you have any questions, please feel free to contact us via email `JamesZhutheThird@sjtu.edu.cn` and `xuyang0112@sjtu.edu.cn`
|
open-llm-leaderboard/details_binbi__SF-72B-V1.8.6-V1.2 | ---
pretty_name: Evaluation run of binbi/SF-72B-V1.8.6-V1.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [binbi/SF-72B-V1.8.6-V1.2](https://huggingface.co/binbi/SF-72B-V1.8.6-V1.2) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_binbi__SF-72B-V1.8.6-V1.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-21T20:26:22.258506](https://huggingface.co/datasets/open-llm-leaderboard/details_binbi__SF-72B-V1.8.6-V1.2/blob/main/results_2024-01-21T20-26-22.258506.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2312183583064229,\n\
\ \"acc_stderr\": 0.029963667974972664,\n \"acc_norm\": 0.2311618522242625,\n\
\ \"acc_norm_stderr\": 0.030751973434955327,\n \"mc1\": 0.2350061199510404,\n\
\ \"mc1_stderr\": 0.014843061507731603,\n \"mc2\": 0.4877798130299791,\n\
\ \"mc2_stderr\": 0.016318959342538\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2235494880546075,\n \"acc_stderr\": 0.012174896631202605,\n\
\ \"acc_norm\": 0.2627986348122867,\n \"acc_norm_stderr\": 0.012862523175351333\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25801633140808605,\n\
\ \"acc_stderr\": 0.004366488167386393,\n \"acc_norm\": 0.24865564628560047,\n\
\ \"acc_norm_stderr\": 0.004313503876346078\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n\
\ \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n\
\ \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.23018867924528302,\n \"acc_stderr\": 0.02590789712240817,\n\
\ \"acc_norm\": 0.23018867924528302,\n \"acc_norm_stderr\": 0.02590789712240817\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n\
\ \"acc_stderr\": 0.03970158273235172,\n \"acc_norm\": 0.2698412698412698,\n\
\ \"acc_norm_stderr\": 0.03970158273235172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371376,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371376\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604243,\n \"\
acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604243\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.25112107623318386,\n\
\ \"acc_stderr\": 0.02910522083322462,\n \"acc_norm\": 0.25112107623318386,\n\
\ \"acc_norm_stderr\": 0.02910522083322462\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.03915345408847836,\n\
\ \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.03915345408847836\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.22330097087378642,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.22330097087378642,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.21711366538952745,\n\
\ \"acc_stderr\": 0.014743125394823295,\n \"acc_norm\": 0.21711366538952745,\n\
\ \"acc_norm_stderr\": 0.014743125394823295\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25163398692810457,\n \"acc_stderr\": 0.01755581809132226,\n \
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.01755581809132226\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n\
\ \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n\
\ \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n\
\ \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n\
\ \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n\
\ \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.23391812865497075,\n \"acc_stderr\": 0.032467217651178264,\n\
\ \"acc_norm\": 0.23391812865497075,\n \"acc_norm_stderr\": 0.032467217651178264\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2350061199510404,\n\
\ \"mc1_stderr\": 0.014843061507731603,\n \"mc2\": 0.4877798130299791,\n\
\ \"mc2_stderr\": 0.016318959342538\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.4956590370955012,\n \"acc_stderr\": 0.014051956064076906\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/binbi/SF-72B-V1.8.6-V1.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|arc:challenge|25_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|arc:challenge|25_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|gsm8k|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|gsm8k|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hellaswag|10_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hellaswag|10_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T20-13-01.457531.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T20-26-22.258506.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T20-26-22.258506.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- '**/details_harness|winogrande|5_2024-01-21T20-13-01.457531.parquet'
- split: 2024_01_21T20_26_22.258506
path:
- '**/details_harness|winogrande|5_2024-01-21T20-26-22.258506.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-21T20-26-22.258506.parquet'
- config_name: results
data_files:
- split: 2024_01_21T20_13_01.457531
path:
- results_2024-01-21T20-13-01.457531.parquet
- split: 2024_01_21T20_26_22.258506
path:
- results_2024-01-21T20-26-22.258506.parquet
- split: latest
path:
- results_2024-01-21T20-26-22.258506.parquet
---
# Dataset Card for Evaluation run of binbi/SF-72B-V1.8.6-V1.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [binbi/SF-72B-V1.8.6-V1.2](https://huggingface.co/binbi/SF-72B-V1.8.6-V1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_binbi__SF-72B-V1.8.6-V1.2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-21T20:26:22.258506](https://huggingface.co/datasets/open-llm-leaderboard/details_binbi__SF-72B-V1.8.6-V1.2/blob/main/results_2024-01-21T20-26-22.258506.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2312183583064229,
"acc_stderr": 0.029963667974972664,
"acc_norm": 0.2311618522242625,
"acc_norm_stderr": 0.030751973434955327,
"mc1": 0.2350061199510404,
"mc1_stderr": 0.014843061507731603,
"mc2": 0.4877798130299791,
"mc2_stderr": 0.016318959342538
},
"harness|arc:challenge|25": {
"acc": 0.2235494880546075,
"acc_stderr": 0.012174896631202605,
"acc_norm": 0.2627986348122867,
"acc_norm_stderr": 0.012862523175351333
},
"harness|hellaswag|10": {
"acc": 0.25801633140808605,
"acc_stderr": 0.004366488167386393,
"acc_norm": 0.24865564628560047,
"acc_norm_stderr": 0.004313503876346078
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.03673731683969506,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.03673731683969506
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.23018867924528302,
"acc_stderr": 0.02590789712240817,
"acc_norm": 0.23018867924528302,
"acc_norm_stderr": 0.02590789712240817
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235172,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371376,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371376
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604243,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604243
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.25112107623318386,
"acc_stderr": 0.02910522083322462,
"acc_norm": 0.25112107623318386,
"acc_norm_stderr": 0.02910522083322462
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2748091603053435,
"acc_stderr": 0.03915345408847836,
"acc_norm": 0.2748091603053435,
"acc_norm_stderr": 0.03915345408847836
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.22330097087378642,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.22330097087378642,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.21711366538952745,
"acc_stderr": 0.014743125394823295,
"acc_norm": 0.21711366538952745,
"acc_norm_stderr": 0.014743125394823295
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.01755581809132226,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.01755581809132226
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.23391812865497075,
"acc_stderr": 0.032467217651178264,
"acc_norm": 0.23391812865497075,
"acc_norm_stderr": 0.032467217651178264
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2350061199510404,
"mc1_stderr": 0.014843061507731603,
"mc2": 0.4877798130299791,
"mc2_stderr": 0.016318959342538
},
"harness|winogrande|5": {
"acc": 0.4956590370955012,
"acc_stderr": 0.014051956064076906
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mcanoglu/defect-cwe-grouping | ---
license: mit
---
|
lvkaokao/ld_requests | ---
license: apache-2.0
---
|
origami-digital/fraxtil | ---
license: unknown
---
|
bkai-foundation-models/NewsSapo | ---
task_categories:
- summarization
- feature-extraction
language:
- vi
pretty_name: Vietnamese NewsSapo Dataset
size_categories:
- 10M<n<100M
---
Vietnamese NewsSapo Dataset
The Vietnamese NewsSapo dataset was constructed to train sentence/passage embeddings. Our dataset is structured in a "title-abstract-contents" format, where each news article is represented by a tuple of (title, abstract, content). The content is the main text body of the article and has been processed to remove images, videos, and other non-textual elements. The dataset contains 31,728,183 triples.
To build this dataset, we followed a two-step process:
Step 1: Collect news data from 2021-11/2023. Combine with [Binhvq News Corpus](https://github.com/binhvq/news-corpus) to form a unified dataset.
Step 2: Extract title-sapo-content for each article.
### Please cite our manuscript if this dataset is used for your work
```
@article{duc2024towards,
title={Towards Comprehensive Vietnamese Retrieval-Augmented Generation and Large Language Models},
author={Nguyen Quang Duc, Le Hai Son, Nguyen Duc Nhan, Nguyen Dich Nhat Minh, Le Thanh Huong, Dinh Viet Sang},
journal={arXiv preprint arXiv:2403.01616},
year={2024}
}
``` |
korexyz/unsplash-people-v3 | ---
dataset_info:
features:
- name: url
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1152867.0
num_examples: 4500
download_size: 314820
dataset_size: 1152867.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_HenryJJ__Instruct_Yi-6B_Dolly_CodeAlpaca | ---
pretty_name: Evaluation run of HenryJJ/Instruct_Yi-6B_Dolly_CodeAlpaca
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [HenryJJ/Instruct_Yi-6B_Dolly_CodeAlpaca](https://huggingface.co/HenryJJ/Instruct_Yi-6B_Dolly_CodeAlpaca)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_HenryJJ__Instruct_Yi-6B_Dolly_CodeAlpaca\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-08T03:36:26.320528](https://huggingface.co/datasets/open-llm-leaderboard/details_HenryJJ__Instruct_Yi-6B_Dolly_CodeAlpaca/blob/main/results_2024-01-08T03-36-26.320528.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6236571632946254,\n\
\ \"acc_stderr\": 0.03222013618820489,\n \"acc_norm\": 0.6309524776297984,\n\
\ \"acc_norm_stderr\": 0.03288521739348617,\n \"mc1\": 0.27906976744186046,\n\
\ \"mc1_stderr\": 0.0157021070906279,\n \"mc2\": 0.41422968964840373,\n\
\ \"mc2_stderr\": 0.014212709995879808\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5025597269624573,\n \"acc_stderr\": 0.014611199329843784,\n\
\ \"acc_norm\": 0.5315699658703071,\n \"acc_norm_stderr\": 0.014582236460866975\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5616411073491336,\n\
\ \"acc_stderr\": 0.004951717622007979,\n \"acc_norm\": 0.7530372435769767,\n\
\ \"acc_norm_stderr\": 0.0043036354511158045\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.03894734487013317,\n\
\ \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.03894734487013317\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6127659574468085,\n \"acc_stderr\": 0.03184389265339526,\n\
\ \"acc_norm\": 0.6127659574468085,\n \"acc_norm_stderr\": 0.03184389265339526\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n\
\ \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.37719298245614036,\n\
\ \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6275862068965518,\n \"acc_stderr\": 0.04028731532947558,\n\
\ \"acc_norm\": 0.6275862068965518,\n \"acc_norm_stderr\": 0.04028731532947558\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4497354497354497,\n \"acc_stderr\": 0.025620857042936655,\n \"\
acc_norm\": 0.4497354497354497,\n \"acc_norm_stderr\": 0.025620857042936655\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.04343525428949097,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.04343525428949097\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n\
\ \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n\
\ \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.02655220782821529,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.02655220782821529\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n\
\ \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6153846153846154,\n \"acc_stderr\": 0.02466674491518722,\n \
\ \"acc_norm\": 0.6153846153846154,\n \"acc_norm_stderr\": 0.02466674491518722\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473072,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473072\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7478991596638656,\n \"acc_stderr\": 0.028205545033277726,\n\
\ \"acc_norm\": 0.7478991596638656,\n \"acc_norm_stderr\": 0.028205545033277726\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8293577981651377,\n \"acc_stderr\": 0.01612927102509985,\n \"\
acc_norm\": 0.8293577981651377,\n \"acc_norm_stderr\": 0.01612927102509985\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474082,\n \"\
acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474082\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286775,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286775\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.02023714900899094,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.02023714900899094\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8071519795657727,\n\
\ \"acc_stderr\": 0.014108533515757431,\n \"acc_norm\": 0.8071519795657727,\n\
\ \"acc_norm_stderr\": 0.014108533515757431\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323378,\n\
\ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323378\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35083798882681566,\n\
\ \"acc_stderr\": 0.015961036675230966,\n \"acc_norm\": 0.35083798882681566,\n\
\ \"acc_norm_stderr\": 0.015961036675230966\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.026256053835718964,\n\
\ \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.026256053835718964\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.025089478523765134,\n\
\ \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.025089478523765134\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46099290780141844,\n \"acc_stderr\": 0.029736592526424434,\n \
\ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.029736592526424434\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49022164276401564,\n\
\ \"acc_stderr\": 0.012767793787729338,\n \"acc_norm\": 0.49022164276401564,\n\
\ \"acc_norm_stderr\": 0.012767793787729338\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6066176470588235,\n \"acc_stderr\": 0.029674288281311155,\n\
\ \"acc_norm\": 0.6066176470588235,\n \"acc_norm_stderr\": 0.029674288281311155\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6552287581699346,\n \"acc_stderr\": 0.019228322018696644,\n \
\ \"acc_norm\": 0.6552287581699346,\n \"acc_norm_stderr\": 0.019228322018696644\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368036,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368036\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27906976744186046,\n\
\ \"mc1_stderr\": 0.0157021070906279,\n \"mc2\": 0.41422968964840373,\n\
\ \"mc2_stderr\": 0.014212709995879808\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7537490134175217,\n \"acc_stderr\": 0.01210836530743752\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2835481425322214,\n \
\ \"acc_stderr\": 0.012415070917508127\n }\n}\n```"
repo_url: https://huggingface.co/HenryJJ/Instruct_Yi-6B_Dolly_CodeAlpaca
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|arc:challenge|25_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|arc:challenge|25_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|gsm8k|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|gsm8k|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hellaswag|10_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hellaswag|10_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-07T21-59-12.253105.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T03-36-26.320528.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-08T03-36-26.320528.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- '**/details_harness|winogrande|5_2024-01-07T21-59-12.253105.parquet'
- split: 2024_01_08T03_36_26.320528
path:
- '**/details_harness|winogrande|5_2024-01-08T03-36-26.320528.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-08T03-36-26.320528.parquet'
- config_name: results
data_files:
- split: 2024_01_07T21_59_12.253105
path:
- results_2024-01-07T21-59-12.253105.parquet
- split: 2024_01_08T03_36_26.320528
path:
- results_2024-01-08T03-36-26.320528.parquet
- split: latest
path:
- results_2024-01-08T03-36-26.320528.parquet
---
# Dataset Card for Evaluation run of HenryJJ/Instruct_Yi-6B_Dolly_CodeAlpaca
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [HenryJJ/Instruct_Yi-6B_Dolly_CodeAlpaca](https://huggingface.co/HenryJJ/Instruct_Yi-6B_Dolly_CodeAlpaca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_HenryJJ__Instruct_Yi-6B_Dolly_CodeAlpaca",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-08T03:36:26.320528](https://huggingface.co/datasets/open-llm-leaderboard/details_HenryJJ__Instruct_Yi-6B_Dolly_CodeAlpaca/blob/main/results_2024-01-08T03-36-26.320528.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6236571632946254,
"acc_stderr": 0.03222013618820489,
"acc_norm": 0.6309524776297984,
"acc_norm_stderr": 0.03288521739348617,
"mc1": 0.27906976744186046,
"mc1_stderr": 0.0157021070906279,
"mc2": 0.41422968964840373,
"mc2_stderr": 0.014212709995879808
},
"harness|arc:challenge|25": {
"acc": 0.5025597269624573,
"acc_stderr": 0.014611199329843784,
"acc_norm": 0.5315699658703071,
"acc_norm_stderr": 0.014582236460866975
},
"harness|hellaswag|10": {
"acc": 0.5616411073491336,
"acc_stderr": 0.004951717622007979,
"acc_norm": 0.7530372435769767,
"acc_norm_stderr": 0.0043036354511158045
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.03894734487013317,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.03894734487013317
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6127659574468085,
"acc_stderr": 0.03184389265339526,
"acc_norm": 0.6127659574468085,
"acc_norm_stderr": 0.03184389265339526
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.04559522141958216,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.04559522141958216
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6275862068965518,
"acc_stderr": 0.04028731532947558,
"acc_norm": 0.6275862068965518,
"acc_norm_stderr": 0.04028731532947558
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4497354497354497,
"acc_stderr": 0.025620857042936655,
"acc_norm": 0.4497354497354497,
"acc_norm_stderr": 0.025620857042936655
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949097,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949097
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.02655220782821529,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.02655220782821529
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6153846153846154,
"acc_stderr": 0.02466674491518722,
"acc_norm": 0.6153846153846154,
"acc_norm_stderr": 0.02466674491518722
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473072,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473072
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7478991596638656,
"acc_stderr": 0.028205545033277726,
"acc_norm": 0.7478991596638656,
"acc_norm_stderr": 0.028205545033277726
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8293577981651377,
"acc_stderr": 0.01612927102509985,
"acc_norm": 0.8293577981651377,
"acc_norm_stderr": 0.01612927102509985
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.027044621719474082,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.027044621719474082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286775,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286775
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899094,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899094
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8071519795657727,
"acc_stderr": 0.014108533515757431,
"acc_norm": 0.8071519795657727,
"acc_norm_stderr": 0.014108533515757431
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.024257901705323378,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.024257901705323378
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35083798882681566,
"acc_stderr": 0.015961036675230966,
"acc_norm": 0.35083798882681566,
"acc_norm_stderr": 0.015961036675230966
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.026256053835718964,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.026256053835718964
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7160493827160493,
"acc_stderr": 0.025089478523765134,
"acc_norm": 0.7160493827160493,
"acc_norm_stderr": 0.025089478523765134
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.029736592526424434,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.029736592526424434
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49022164276401564,
"acc_stderr": 0.012767793787729338,
"acc_norm": 0.49022164276401564,
"acc_norm_stderr": 0.012767793787729338
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6066176470588235,
"acc_stderr": 0.029674288281311155,
"acc_norm": 0.6066176470588235,
"acc_norm_stderr": 0.029674288281311155
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6552287581699346,
"acc_stderr": 0.019228322018696644,
"acc_norm": 0.6552287581699346,
"acc_norm_stderr": 0.019228322018696644
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368036,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368036
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27906976744186046,
"mc1_stderr": 0.0157021070906279,
"mc2": 0.41422968964840373,
"mc2_stderr": 0.014212709995879808
},
"harness|winogrande|5": {
"acc": 0.7537490134175217,
"acc_stderr": 0.01210836530743752
},
"harness|gsm8k|5": {
"acc": 0.2835481425322214,
"acc_stderr": 0.012415070917508127
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
deepapaikar/YU_QA_set | ---
license: apache-2.0
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1697714
num_examples: 6714
download_size: 721284
dataset_size: 1697714
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_kyujinpy__Sakura-SOLAR-Instruct-DPO-v2 | ---
pretty_name: Evaluation run of kyujinpy/Sakura-SOLAR-Instruct-DPO-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [kyujinpy/Sakura-SOLAR-Instruct-DPO-v2](https://huggingface.co/kyujinpy/Sakura-SOLAR-Instruct-DPO-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kyujinpy__Sakura-SOLAR-Instruct-DPO-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-29T22:39:58.895628](https://huggingface.co/datasets/open-llm-leaderboard/details_kyujinpy__Sakura-SOLAR-Instruct-DPO-v2/blob/main/results_2023-12-29T22-39-58.895628.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6682468010299201,\n\
\ \"acc_stderr\": 0.031550102562656,\n \"acc_norm\": 0.6692469699297998,\n\
\ \"acc_norm_stderr\": 0.03219064838817908,\n \"mc1\": 0.572827417380661,\n\
\ \"mc1_stderr\": 0.017316834410963926,\n \"mc2\": 0.7185849753394141,\n\
\ \"mc2_stderr\": 0.014985704637518712\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6851535836177475,\n \"acc_stderr\": 0.01357265770308495,\n\
\ \"acc_norm\": 0.7090443686006825,\n \"acc_norm_stderr\": 0.013273077865907593\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7136028679545907,\n\
\ \"acc_stderr\": 0.004511533039406214,\n \"acc_norm\": 0.8840868352917746,\n\
\ \"acc_norm_stderr\": 0.003194665266078602\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.756578947368421,\n \"acc_stderr\": 0.034923496688842384,\n\
\ \"acc_norm\": 0.756578947368421,\n \"acc_norm_stderr\": 0.034923496688842384\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n\
\ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736413,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736413\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6297872340425532,\n \"acc_stderr\": 0.03156564682236786,\n\
\ \"acc_norm\": 0.6297872340425532,\n \"acc_norm_stderr\": 0.03156564682236786\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6413793103448275,\n \"acc_stderr\": 0.039966295748767186,\n\
\ \"acc_norm\": 0.6413793103448275,\n \"acc_norm_stderr\": 0.039966295748767186\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.48677248677248675,\n \"acc_stderr\": 0.025742297289575142,\n \"\
acc_norm\": 0.48677248677248675,\n \"acc_norm_stderr\": 0.025742297289575142\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8225806451612904,\n\
\ \"acc_stderr\": 0.021732540689329286,\n \"acc_norm\": 0.8225806451612904,\n\
\ \"acc_norm_stderr\": 0.021732540689329286\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.03515895551165698,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.03515895551165698\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8121212121212121,\n \"acc_stderr\": 0.03050193405942914,\n\
\ \"acc_norm\": 0.8121212121212121,\n \"acc_norm_stderr\": 0.03050193405942914\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822516,\n \"\
acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822516\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603347,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603347\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \
\ \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634332,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634332\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669235,\n \"\
acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669235\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5740740740740741,\n \"acc_stderr\": 0.033723432716530624,\n \"\
acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.033723432716530624\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8578431372549019,\n \"acc_stderr\": 0.024509803921568624,\n \"\
acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.024509803921568624\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8565400843881856,\n \"acc_stderr\": 0.022818291821017012,\n \
\ \"acc_norm\": 0.8565400843881856,\n \"acc_norm_stderr\": 0.022818291821017012\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709696,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709696\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.03492606476623791,\n\
\ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.03492606476623791\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.0230866350868414,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.0230866350868414\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8071519795657727,\n\
\ \"acc_stderr\": 0.014108533515757431,\n \"acc_norm\": 0.8071519795657727,\n\
\ \"acc_norm_stderr\": 0.014108533515757431\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7601156069364162,\n \"acc_stderr\": 0.022989592543123567,\n\
\ \"acc_norm\": 0.7601156069364162,\n \"acc_norm_stderr\": 0.022989592543123567\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4022346368715084,\n\
\ \"acc_stderr\": 0.01639971673284714,\n \"acc_norm\": 0.4022346368715084,\n\
\ \"acc_norm_stderr\": 0.01639971673284714\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.761437908496732,\n \"acc_stderr\": 0.02440439492808787,\n\
\ \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.02440439492808787\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.729903536977492,\n\
\ \"acc_stderr\": 0.02521804037341062,\n \"acc_norm\": 0.729903536977492,\n\
\ \"acc_norm_stderr\": 0.02521804037341062\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7839506172839507,\n \"acc_stderr\": 0.022899162918445803,\n\
\ \"acc_norm\": 0.7839506172839507,\n \"acc_norm_stderr\": 0.022899162918445803\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4934810951760104,\n\
\ \"acc_stderr\": 0.012769150688867503,\n \"acc_norm\": 0.4934810951760104,\n\
\ \"acc_norm_stderr\": 0.012769150688867503\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.026303648393696036,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.026303648393696036\n \
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\"\
: 0.6862745098039216,\n \"acc_stderr\": 0.01877168389352817,\n \"\
acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.01877168389352817\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.027979823538744546,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.027979823538744546\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.02553843336857834,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.02553843336857834\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n\
\ \"acc_stderr\": 0.038444531817709175,\n \"acc_norm\": 0.5783132530120482,\n\
\ \"acc_norm_stderr\": 0.038444531817709175\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338733,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338733\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.572827417380661,\n\
\ \"mc1_stderr\": 0.017316834410963926,\n \"mc2\": 0.7185849753394141,\n\
\ \"mc2_stderr\": 0.014985704637518712\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8342541436464088,\n \"acc_stderr\": 0.010450899545370632\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6376042456406369,\n \
\ \"acc_stderr\": 0.013240654263574762\n }\n}\n```"
repo_url: https://huggingface.co/kyujinpy/Sakura-SOLAR-Instruct-DPO-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|arc:challenge|25_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|gsm8k|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hellaswag|10_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T22-39-58.895628.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T22-39-58.895628.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- '**/details_harness|winogrande|5_2023-12-29T22-39-58.895628.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-29T22-39-58.895628.parquet'
- config_name: results
data_files:
- split: 2023_12_29T22_39_58.895628
path:
- results_2023-12-29T22-39-58.895628.parquet
- split: latest
path:
- results_2023-12-29T22-39-58.895628.parquet
---
# Dataset Card for Evaluation run of kyujinpy/Sakura-SOLAR-Instruct-DPO-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [kyujinpy/Sakura-SOLAR-Instruct-DPO-v2](https://huggingface.co/kyujinpy/Sakura-SOLAR-Instruct-DPO-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kyujinpy__Sakura-SOLAR-Instruct-DPO-v2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-29T22:39:58.895628](https://huggingface.co/datasets/open-llm-leaderboard/details_kyujinpy__Sakura-SOLAR-Instruct-DPO-v2/blob/main/results_2023-12-29T22-39-58.895628.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6682468010299201,
"acc_stderr": 0.031550102562656,
"acc_norm": 0.6692469699297998,
"acc_norm_stderr": 0.03219064838817908,
"mc1": 0.572827417380661,
"mc1_stderr": 0.017316834410963926,
"mc2": 0.7185849753394141,
"mc2_stderr": 0.014985704637518712
},
"harness|arc:challenge|25": {
"acc": 0.6851535836177475,
"acc_stderr": 0.01357265770308495,
"acc_norm": 0.7090443686006825,
"acc_norm_stderr": 0.013273077865907593
},
"harness|hellaswag|10": {
"acc": 0.7136028679545907,
"acc_stderr": 0.004511533039406214,
"acc_norm": 0.8840868352917746,
"acc_norm_stderr": 0.003194665266078602
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.756578947368421,
"acc_stderr": 0.034923496688842384,
"acc_norm": 0.756578947368421,
"acc_norm_stderr": 0.034923496688842384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736413,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736413
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6297872340425532,
"acc_stderr": 0.03156564682236786,
"acc_norm": 0.6297872340425532,
"acc_norm_stderr": 0.03156564682236786
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6413793103448275,
"acc_stderr": 0.039966295748767186,
"acc_norm": 0.6413793103448275,
"acc_norm_stderr": 0.039966295748767186
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.48677248677248675,
"acc_stderr": 0.025742297289575142,
"acc_norm": 0.48677248677248675,
"acc_norm_stderr": 0.025742297289575142
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8225806451612904,
"acc_stderr": 0.021732540689329286,
"acc_norm": 0.8225806451612904,
"acc_norm_stderr": 0.021732540689329286
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.03515895551165698,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.03515895551165698
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8121212121212121,
"acc_stderr": 0.03050193405942914,
"acc_norm": 0.8121212121212121,
"acc_norm_stderr": 0.03050193405942914
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822516,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822516
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603347,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603347
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02944316932303154,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02944316932303154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.029344572500634332,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.029344572500634332
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669235,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669235
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.033723432716530624,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.033723432716530624
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.024509803921568624,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.024509803921568624
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8565400843881856,
"acc_stderr": 0.022818291821017012,
"acc_norm": 0.8565400843881856,
"acc_norm_stderr": 0.022818291821017012
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709696,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709696
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.03492606476623791,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.03492606476623791
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.0230866350868414,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.0230866350868414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8071519795657727,
"acc_stderr": 0.014108533515757431,
"acc_norm": 0.8071519795657727,
"acc_norm_stderr": 0.014108533515757431
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7601156069364162,
"acc_stderr": 0.022989592543123567,
"acc_norm": 0.7601156069364162,
"acc_norm_stderr": 0.022989592543123567
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4022346368715084,
"acc_stderr": 0.01639971673284714,
"acc_norm": 0.4022346368715084,
"acc_norm_stderr": 0.01639971673284714
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.761437908496732,
"acc_stderr": 0.02440439492808787,
"acc_norm": 0.761437908496732,
"acc_norm_stderr": 0.02440439492808787
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.729903536977492,
"acc_stderr": 0.02521804037341062,
"acc_norm": 0.729903536977492,
"acc_norm_stderr": 0.02521804037341062
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7839506172839507,
"acc_stderr": 0.022899162918445803,
"acc_norm": 0.7839506172839507,
"acc_norm_stderr": 0.022899162918445803
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4934810951760104,
"acc_stderr": 0.012769150688867503,
"acc_norm": 0.4934810951760104,
"acc_norm_stderr": 0.012769150688867503
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.75,
"acc_stderr": 0.026303648393696036,
"acc_norm": 0.75,
"acc_norm_stderr": 0.026303648393696036
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.01877168389352817,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.01877168389352817
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.027979823538744546,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.027979823538744546
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.02553843336857834,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.02553843336857834
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5783132530120482,
"acc_stderr": 0.038444531817709175,
"acc_norm": 0.5783132530120482,
"acc_norm_stderr": 0.038444531817709175
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.03158149539338733,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.03158149539338733
},
"harness|truthfulqa:mc|0": {
"mc1": 0.572827417380661,
"mc1_stderr": 0.017316834410963926,
"mc2": 0.7185849753394141,
"mc2_stderr": 0.014985704637518712
},
"harness|winogrande|5": {
"acc": 0.8342541436464088,
"acc_stderr": 0.010450899545370632
},
"harness|gsm8k|5": {
"acc": 0.6376042456406369,
"acc_stderr": 0.013240654263574762
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
OneSimmer/Galisteu | ---
license: openrail
---
|
ohmno2/any-_amu | ---
license: apache-2.0
---
|
NbAiLab/norwegian_parliament | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- no
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-classification
---
# Dataset Card Creation Guide
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-instances)
- [Data Splits](#data-instances)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** N/A
- **Repository:** [GitHub](https://github.com/ltgoslo/NorBERT/)
- **Paper:** N/A
- **Leaderboard:** N/A
- **Point of Contact:** -
### Dataset Summary
The Norwegian Parliament Speeches is a collection of text passages from 1998 to 2016 and pronounced at the Norwegian Parliament (Storting) by members of the two major parties: Fremskrittspartiet and Sosialistisk Venstreparti. The dataset is annotated with the party the speaker was associated with at the time (dates of speeches are also included).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
The text in the dataset is in Norwegian.
## Dataset Structure
### Data Instances
Example of one instance in the dataset.
```{'label': 0, 'text': 'Verre er det med slagsmålene .'}```
### Data Fields
- `id`: index of the example
- `text`: Text of a speech
- `date`: Date (`YYYY-MM-DD`) the speech was produced
- `label`: Political party the speaker was associated with at the time
- 0 = Fremskrittspartiet
- 1 = Sosialistisk Venstreparti
### Data Splits
The dataset is split into a `train`, `validation`, and `test` split with the following sizes:
| | Tain | Valid | Test |
| ----- | ------ | ----- | ----- |
| Number of examples | 3600 | 1200 | 1200 |
The dataset is balanced on political party.
## Dataset Creation
This dataset is based on the publicly available information by Norwegian Parliament (Storting) and created by the National Library of Norway AI-Lab to benchmark their language models.
## Additional Information
### Licensing Information
This work is licensed under a Creative Commons Attribution 4.0 International License
### Citation Information
```latex
@misc{--,
title={--},
author={--},
year={2021},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
The-F00L/classeurcsv | ---
license: mit
---
|
MU-NLPC/Calc-ape210k_selftrain_experiment_negative | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: question_chinese
dtype: string
- name: chain
dtype: string
- name: result
dtype: string
- name: result_float
dtype: float64
- name: equation
dtype: string
- name: model_checkpoint
dtype: string
- name: prediction
dtype: string
splits:
- name: train
num_bytes: 43570564
num_examples: 48194
download_size: 12441464
dataset_size: 43570564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Calc-ape210k_selftrain_experiment_prompted"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
galman33/gal_yair_8300_256x256_fixed | ---
dataset_info:
features:
- name: lat
dtype: float64
- name: lon
dtype: float64
- name: country_code
dtype:
class_label:
names:
'0': ad
'1': ae
'2': al
'3': aq
'4': ar
'5': au
'6': bd
'7': be
'8': bg
'9': bm
'10': bo
'11': br
'12': bt
'13': bw
'14': ca
'15': ch
'16': cl
'17': co
'18': cz
'19': de
'20': dk
'21': ec
'22': ee
'23': es
'24': fi
'25': fr
'26': gb
'27': gh
'28': gl
'29': gr
'30': gt
'31': hk
'32': hr
'33': hu
'34': id
'35': ie
'36': il
'37': is
'38': it
'39': ix
'40': jp
'41': kg
'42': kh
'43': kr
'44': la
'45': lk
'46': ls
'47': lt
'48': lu
'49': lv
'50': me
'51': mg
'52': mk
'53': mn
'54': mo
'55': mt
'56': mx
'57': my
'58': nl
'59': 'no'
'60': nz
'61': pe
'62': ph
'63': pl
'64': pt
'65': ro
'66': rs
'67': ru
'68': se
'69': sg
'70': si
'71': sk
'72': sn
'73': sz
'74': th
'75': tn
'76': tr
'77': tw
'78': ua
'79': ug
'80': us
'81': uy
'82': za
- name: image
dtype: image
splits:
- name: train
num_bytes: 805028017.5
num_examples: 8300
download_size: 804437967
dataset_size: 805028017.5
---
# Dataset Card for "gal_yair_8300_256x256_fixed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_sst2_yall | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 9631
num_examples: 63
- name: test
num_bytes: 20510
num_examples: 133
- name: train
num_bytes: 271614
num_examples: 2360
download_size: 155237
dataset_size: 301755
---
# Dataset Card for "MULTI_VALUE_sst2_yall"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ai-research-id/wikihow-th | ---
license: cc
---
|
M-A-D/Mixed-Arabic-Dataset-Main-Test | ---
dataset_info:
features:
- name: GenId
dtype: int64
- name: SubId
dtype: int64
- name: DatasetName
dtype: string
- name: DatasetLink
dtype: string
- name: Text
dtype: string
- name: MetaData
struct:
- name: AboutAuthor
dtype: 'null'
- name: AboutBook
dtype: 'null'
- name: Author
dtype: 'null'
- name: AuthorName
dtype: 'null'
- name: BookLink
dtype: 'null'
- name: BookName
dtype: 'null'
- name: ChapterLink
dtype: 'null'
- name: ChapterName
dtype: 'null'
- name: Tags
dtype: 'null'
- name: __index_level_0__
dtype: float64
- name: created_date
dtype: string
- name: deleted
dtype: bool
- name: detoxify
dtype: 'null'
- name: emojis
struct:
- name: count
sequence: int32
- name: name
sequence: string
- name: id
dtype: string
- name: labels
struct:
- name: count
sequence: int32
- name: name
sequence: string
- name: value
sequence: float64
- name: lang
dtype: string
- name: message_id
dtype: string
- name: message_tree_id
dtype: string
- name: model_name
dtype: 'null'
- name: parent_id
dtype: string
- name: query_id
dtype: 'null'
- name: rank
dtype: float64
- name: review_count
dtype: float64
- name: review_result
dtype: bool
- name: role
dtype: string
- name: synthetic
dtype: bool
- name: title
dtype: 'null'
- name: tree_state
dtype: string
- name: url
dtype: 'null'
- name: user_id
dtype: string
- name: ConcatenatedText
dtype: int64
splits:
- name: train
num_bytes: 96491917
num_examples: 71935
download_size: 37192033
dataset_size: 96491917
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Mixed-Arabic-Dataset-Main-Test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NorGLM/NO-QNLI | ---
license: unknown
language:
- 'no'
---
# Dataset Card for NO-QNLI
## Dataset Summary
NO-QNLI is machine translated from the Stanford Question Answering Dataset containing question-paragraph pairs. The question is written by human annotator and the paragraph is dran from Wikipedia consisting the answer to the question.
This dataset belongs to NLEBench Norwegian benchmarks for evaluation on Norwegian Natrual Language Undersanding (NLU) tasks.
## Data Split
The dataset is split into train, val and test sets sourced from it original distributions. More information can refer to [link](https://huggingface.co/datasets/nyu-mll/glue).
## Licensing Information
This dataset is built upon the existing datasets. We therefore follow its original license information.
## Citation Information
We encourage to cite the GLUE benchmark:
```
@inproceedings{wang2019glue,
title={{GLUE}: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding},
author={Wang, Alex and Singh, Amanpreet and Michael, Julian and Hill, Felix and Levy, Omer and Bowman, Samuel R.},
note={In the Proceedings of ICLR.},
year={2019}
}
```
|
gg-ai/es-2111-no-demoji-hashtag-m | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: val
path: data/val-*
dataset_info:
features:
- name: text
dtype: string
- name: clean_text
dtype: string
- name: sent
dtype: int64
splits:
- name: train
num_bytes: 9488395
num_examples: 23119
- name: test
num_bytes: 1405379
num_examples: 3467
- name: val
num_bytes: 240263
num_examples: 612
download_size: 0
dataset_size: 11134037
---
# Dataset Card for "es-2111-no-demoji-hashtag-m"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dim/ficbook_raw_best_10k | ---
dataset_info:
features:
- name: id
dtype: string
- name: author
dtype: string
- name: title
dtype: string
- name: link
dtype: string
- name: description
dtype: string
- name: tag
dtype: string
- name: likes
dtype: int64
- name: date
dtype: string
- name: review
dtype: string
- name: format
dtype: string
- name: text
dtype: string
- name: rating
dtype: string
- name: status
dtype: string
- name: parts
dtype: string
splits:
- name: train
num_bytes: 91515293.63435334
num_examples: 10000
download_size: 101345356
dataset_size: 91515293.63435334
---
# Dataset Card for "ficbook_raw_best_10k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
GAIR/ReAlign-No-Robots | ---
task_categories:
- question-answering
- conversational
language:
- en
size_categories:
- 1K<n<10K
---
Please refer to our [GitHub repo](https://github.com/GAIR-NLP/ReAlign) for more details. |
CyberHarem/neeko_leagueoflegends | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of neeko (League of Legends)
This is the dataset of neeko (League of Legends), containing 180 images and their tags.
The core tags of this character are `hair_ornament, hair_flower, colored_skin, blue_hair, multicolored_hair, bangs, green_skin, medium_hair, yellow_eyes, purple_hair, tail, breasts, slit_pupils, pink_hair, monster_girl`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 180 | 263.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/neeko_leagueoflegends/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 180 | 129.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/neeko_leagueoflegends/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 437 | 291.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/neeko_leagueoflegends/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 180 | 222.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/neeko_leagueoflegends/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 437 | 452.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/neeko_leagueoflegends/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/neeko_leagueoflegends',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, blush, looking_at_viewer, orange_eyes, pink_flower, shiny_hair, solo, collarbone, large_breasts, bare_shoulders, lizard_tail, shiny_skin, fang, flipped_hair, heart, navel, on_back, open_mouth, :d, bed_sheet, black_bikini, cleavage, knees_up, nipples, nude, tongue_out |
| 1 | 19 |  |  |  |  |  | 1girl, solo, bare_shoulders, pink_flower, necklace, simple_background, white_background, looking_at_viewer, long_hair, teeth, upper_body, :d, open_mouth, shiny_hair, blush, hand_up |
| 2 | 6 |  |  |  |  |  | 1girl, bare_shoulders, flower, solo, artist_name, butterfly, eyeshadow, necklace, eyelashes, looking_at_viewer, parted_lips, pink_lips, cleavage, flipped_hair, lipstick, long_hair, nature, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | looking_at_viewer | orange_eyes | pink_flower | shiny_hair | solo | collarbone | large_breasts | bare_shoulders | lizard_tail | shiny_skin | fang | flipped_hair | heart | navel | on_back | open_mouth | :d | bed_sheet | black_bikini | cleavage | knees_up | nipples | nude | tongue_out | necklace | simple_background | white_background | long_hair | teeth | upper_body | hand_up | flower | artist_name | butterfly | eyeshadow | eyelashes | parted_lips | pink_lips | lipstick | nature |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------------------|:--------------|:--------------|:-------------|:-------|:-------------|:----------------|:-----------------|:--------------|:-------------|:-------|:---------------|:--------|:--------|:----------|:-------------|:-----|:------------|:---------------|:-----------|:-----------|:----------|:-------|:-------------|:-----------|:--------------------|:-------------------|:------------|:--------|:-------------|:----------|:---------|:--------------|:------------|:------------|:------------|:--------------|:------------|:-----------|:---------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 19 |  |  |  |  |  | X | X | X | | X | X | X | | | X | | | | | | | | X | X | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | | X | | | | X | | | X | | | | X | | | | | | | | X | | | | | X | | | X | | X | | X | X | X | X | X | X | X | X | X |
|
strombergnlp/broad_twitter_corpus | ---
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- token-classification
task_ids:
- named-entity-recognition
paperswithcode_id: broad-twitter-corpus
pretty_name: Broad Twitter Corpus
---
# Dataset Card for broad_twitter_corpus
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-instances)
- [Data Splits](#data-instances)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** [https://github.com/GateNLP/broad_twitter_corpus](https://github.com/GateNLP/broad_twitter_corpus)
- **Repository:** [https://github.com/GateNLP/broad_twitter_corpus](https://github.com/GateNLP/broad_twitter_corpus)
- **Paper:** [http://www.aclweb.org/anthology/C16-1111](http://www.aclweb.org/anthology/C16-1111)
- **Leaderboard:** [Named Entity Recognition on Broad Twitter Corpus](https://paperswithcode.com/sota/named-entity-recognition-on-broad-twitter)
- **Point of Contact:** [Leon Derczynski](https://github.com/leondz)
### Dataset Summary
This is the Broad Twitter corpus, a dataset of tweets collected over stratified times, places and social uses. The goal is to represent a broad range of activities, giving a dataset more representative of the language used in this hardest of social media formats to process. Further, the BTC is annotated for named entities.
See the paper, [Broad Twitter Corpus: A Diverse Named Entity Recognition Resource](http://www.aclweb.org/anthology/C16-1111), for details.
### Supported Tasks and Leaderboards
* Named Entity Recognition
* On PWC: [Named Entity Recognition on Broad Twitter Corpus](https://paperswithcode.com/sota/named-entity-recognition-on-broad-twitter)
### Languages
English from UK, US, Australia, Canada, Ireland, New Zealand; `bcp47:en`
## Dataset Structure
### Data Instances
Feature |Count
---|---:
Documents |9 551
Tokens |165 739
Person entities |5 271
Location entities |3 114
Organization entities |3 732
### Data Fields
Each tweet contains an ID, a list of tokens, and a list of NER tags
- `id`: a `string` feature.
- `tokens`: a `list` of `strings`
- `ner_tags`: a `list` of class IDs (`int`s) representing the NER class:
```
0: O
1: B-PER
2: I-PER
3: B-ORG
4: I-ORG
5: B-LOC
6: I-LOC
```
### Data Splits
Section|Region|Collection period|Description|Annotators|Tweet count
---|---|---|---|---|---:
A | UK| 2012.01| General collection |Expert| 1000
B |UK |2012.01-02 |Non-directed tweets |Expert |2000
E |Global| 2014.07| Related to MH17 disaster| Crowd & expert |200
F |Stratified |2009-2014| Twitterati |Crowd & expert |2000
G |Stratified| 2011-2014| Mainstream news| Crowd & expert| 2351
H |Non-UK| 2014 |General collection |Crowd & expert |2000
The most varied parts of the BTC are sections F and H. However, each of the remaining four sections has some specific readily-identifiable bias. So, we propose that one uses half of section H for evaluation and leaves the other half in the training data. Section H should be partitioned in the order of the JSON-format lines. Note that the CoNLL-format data is readily reconstructible from the JSON format, which is the authoritative data format from which others are derived.
**Test**: Section F
**Development**: Section H (the paper says "second half of Section H" but ordinality could be ambiguous, so it all goes in. Bonne chance)
**Training**: everything else
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
Creative Commons Attribution 4.0 International (CC BY 4.0)
### Citation Information
```
@inproceedings{derczynski2016broad,
title={Broad twitter corpus: A diverse named entity recognition resource},
author={Derczynski, Leon and Bontcheva, Kalina and Roberts, Ian},
booktitle={Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: Technical Papers},
pages={1169--1179},
year={2016}
}
```
### Contributions
Author-added dataset [@leondz](https://github.com/leondz)
|
kobe4cn/test | ---
license: apache-2.0
---
|
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.