| datasetId | card |
|---|---|
uname-n/slim-orca-dedup-chat | ---
dataset_info:
features:
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 602329260
num_examples: 363491
download_size: 301939359
dataset_size: 602329260
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bhjhk/masmamad8 | ---
license: creativeml-openrail-m
---
|
AdapterOcean/med_alpaca_standardized_cluster_86 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 51349063
num_examples: 5266
download_size: 15035670
dataset_size: 51349063
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_86"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DynamicSuperb/SpeechDetection_LJSpeech | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype: audio
- name: instruction
dtype: string
- name: label
dtype: string
splits:
- name: test
num_bytes: 58016282.72519084
num_examples: 200
download_size: 56990484
dataset_size: 58016282.72519084
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for "SpeechDetection_LJSpeech"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
myrtotsok/clf-3 | ---
dataset_info:
features:
- name: request
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 121051
num_examples: 1120
- name: validation
num_bytes: 30256
num_examples: 280
download_size: 28195
dataset_size: 151307
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
FINNUMBER/FINCH_TRAIN_ALL_900_per100_NEW_Rationale | ---
dataset_info:
features:
- name: task
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: sub_task
dtype: string
- name: rationale
dtype: string
- name: correct
dtype: bool
- name: check
dtype: bool
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 3764071
num_examples: 900
download_size: 2070353
dataset_size: 3764071
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/stg44_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of stg44/StG44/StG44 (Girls' Frontline)
This is the dataset of stg44/StG44/StG44 (Girls' Frontline), containing 114 images and their tags.
The core tags of this character are `blonde_hair, long_hair, green_eyes, hat, bangs, military_hat, black_headwear, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, etc.); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 114 | 158.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stg44_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 114 | 82.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stg44_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 293 | 187.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stg44_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 114 | 136.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stg44_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 293 | 270.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stg44_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/stg44_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, black_necktie, military_uniform, solo, looking_at_viewer, white_gloves, white_shirt, black_jacket, smile, upper_body, simple_background, white_background, long_sleeves, closed_mouth, open_mouth |
| 1 | 15 |  |  |  |  |  | 1girl, black_necktie, black_skirt, looking_at_viewer, military_uniform, solo, white_shirt, assault_rifle, white_gloves, black_jacket, closed_mouth, holding_gun, black_thighhighs, boots, collared_shirt, black_footwear, military_jacket, belt, long_sleeves, smile, full_body, peaked_cap, simple_background, standing |
| 2 | 5 |  |  |  |  |  | 1girl, black_pantyhose, black_skirt, eyewear_on_head, high_heels, solo, sunglasses, white_shirt, black_footwear, full_body, looking_at_viewer, medium_breasts, pencil_skirt, black_choker, black_jacket, holding, legs, torn_pantyhose, alternate_costume, black_coat, blush, jacket_on_shoulders, long_sleeves, office_lady, shoes, sitting, standing_on_one_leg, tinted_eyewear |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_necktie | military_uniform | solo | looking_at_viewer | white_gloves | white_shirt | black_jacket | smile | upper_body | simple_background | white_background | long_sleeves | closed_mouth | open_mouth | black_skirt | assault_rifle | holding_gun | black_thighhighs | boots | collared_shirt | black_footwear | military_jacket | belt | full_body | peaked_cap | standing | black_pantyhose | eyewear_on_head | high_heels | sunglasses | medium_breasts | pencil_skirt | black_choker | holding | legs | torn_pantyhose | alternate_costume | black_coat | blush | jacket_on_shoulders | office_lady | shoes | sitting | standing_on_one_leg | tinted_eyewear |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------------|:-------------------|:-------|:--------------------|:---------------|:--------------|:---------------|:--------|:-------------|:--------------------|:-------------------|:---------------|:---------------|:-------------|:--------------|:----------------|:--------------|:-------------------|:--------|:-----------------|:-----------------|:------------------|:-------|:------------|:-------------|:-----------|:------------------|:------------------|:-------------|:-------------|:-----------------|:---------------|:---------------|:----------|:-------|:-----------------|:--------------------|:-------------|:--------|:----------------------|:--------------|:--------|:----------|:----------------------|:-----------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 15 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | X | | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | | | X | X | | X | X | | | | | X | | | X | | | | | | X | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
huggingartists/bob-dylan | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/bob-dylan"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 2.91167 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/22306423b6ad8777d1ed5b33ad8b0d0b.1000x1000x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/bob-dylan">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤗 HuggingArtists Model 🤗</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Bob Dylan</div>
<a href="https://genius.com/artists/bob-dylan">
<div style="text-align: center; font-size: 14px;">@bob-dylan</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/bob-dylan).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/bob-dylan")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|2241| -| -|
The 'train' split can be divided into 'train', 'validation', and 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/bob-dylan")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
# Split the 'train' texts at the 90% and 97% marks.
texts = datasets['train']['text']
train, validation, test = np.split(
    texts,
    [int(len(texts) * train_percentage),
     int(len(texts) * (train_percentage + validation_percentage))],
)
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
    author = {Aleksey Korshuk},
    year = {2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
AdapterOcean/med_alpaca_standardized_cluster_30_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 20960368
num_examples: 32918
download_size: 10266852
dataset_size: 20960368
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_30_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nlp-with-deeplearning/ko.SHP | ---
license: cc-by-nc-sa-4.0
task_categories:
- text-generation
- question-answering
tags:
- human feedback
- rlhf
- preferences
- reddit
- preference model
- RL
- NLG
- evaluation
size_categories:
- 100K<n<1M
language:
- ko
- en
---
# 🚢 Korean Stanford Human Preferences Dataset (Ko.SHP)
This dataset is a translation of the [stanfordnlp/SHP](https://huggingface.co/datasets/stanfordnlp/SHP) dataset, produced with a machine-translation system we built in-house.
The content below is the original README translated by the same system; please keep this in mind when reading.
**If you mention this dataset in a paper, please cite the paper:** [Understanding Dataset Difficulty with V-Usable Information (ICML 2022)](https://proceedings.mlr.press/v162/ethayarajh22a.html).
## Summary
SHP is a dataset of **385K collective human preferences** over responses to questions/instructions in 18 different subject areas, from cooking to legal advice.
The preferences are meant to reflect the helpfulness of one response over another, and are intended to be used for training RLHF reward models and NLG evaluation models (e.g., [SteamSHP](https://huggingface.co/stanfordnlp/SteamSHP-flan-t5-xl)).
Each example is a Reddit post containing a question/instruction, paired with two top-level comments on that post, where one comment is (collectively) more preferred by Reddit users.
SHP exploits the fact that if comment A was written *after* comment B but nonetheless has a higher score, then A is ostensibly more preferred to B.
If A had been written before B, we could not draw this conclusion, since its higher score could simply be the result of greater visibility.
We chose data where the preference label reflects which response is more *helpful* rather than which is less *harmful*, the latter being the focus of much past work.
How is SHP different from [Anthropic's HH-RLHF dataset](https://huggingface.co/datasets/Anthropic/hh-rlhf)?
Most notably, all the data in SHP is naturally occurring and human-written, whereas the responses in HH-RLHF are machine-generated, giving two very different distributions that can complement each other.
| Dataset | Size | Input | Label | Domains | Data Format | Length |
| -------------------- | ---- | -------------------------- | ---------------------------- | ------------------------- | ------------------------------------- | --------------- |
| SHP | 385K | Naturally occurring human-written responses | Collective Human Preference | 18 (labelled) | Question/Instruction + Response (Single-turn) | up to 10.1K T5 tokens |
| HH-RLHF | 91K | Dialogue with an LLM | Individual Human Preference | not labelled | Live Chat (Multi-turn) | up to 1.5K T5 tokens |
How is SHP different from other datasets that have scraped Reddit, like [ELI5](https://huggingface.co/datasets/eli5#source-data)?
SHP uses timestamp information to infer preferences, while ELI5 only provides comments and scores -- the latter are not enough to infer preferences, since comments made earlier tend to get higher scores simply from greater visibility.
SHP also contains data from more domains.
| Dataset | Size | Comments + Scores | Preferences | Number of Domains |
| -------------------- | ---- | ------------------ | -------------| ------------------ |
| SHP | 385K | Yes | Yes | 18 |
| ELI5 | 270K | Yes | No | 3 |
## Dataset Structure
There are 18 directories, one for each subreddit, and each directory contains a JSONL file for the train, validation, and test data.
Here's how to get the data using Huggingface's `datasets` library:
```python
from datasets import load_dataset
# Load all the data
dataset = load_dataset("stanfordnlp/shp")
# Load one of the subreddits
dataset = load_dataset("stanfordnlp/shp", data_dir="askculinary")
```
Here's an example from `askculinary/train.json`:
```
{
`post_id`:"qt3nxl",
`domain`:"askculinary_train",
`upvote_ratio`:0.98,
`history`:"๋ผ์ฆ๋ฒ ๋ฆฌ๋ฅผ ๋ถํดํ๋ ๊ฐ์ฅ ์ข์ ๋ฐฉ๋ฒ์ ๋ฌด์์
๋๊น? ์ด์ ๊ฐ์ด, ๊ทธ๋ฌ๋ ๊ฐ๋ณ ์จ์๊น์ง: https:\/\/i.imgur.com\/Z0c6ZKE.jpg ํ์
์ผ๋ก ๋ถ๋ฆฌํด ์๋๋ฐ ์๊ฐ์ด ๋ง์ด ๊ฑธ๋ฆฝ๋๋ค. ์ด๋ฒ ์ฃผ๋ง๊น์ง ์ฝ 10ํ์ด๋๊ฐ ์์๋ฉ๋๋ค.
`c_root_id_A`:"hkh25sc",
`c_root_id_B`:"hkh25lp",
`created_at_utc_A`:1636822112,
`created_at_utc_B`:1636822110,
`score_A`:340,
`score_B`:166,
`human_ref_A`:"Pectinex, ์๋ง๋? ์
๋ฃฐ๋ก์ค์ค๋ฅผ ๋ถํดํ๋ ํจ์์
๋๋ค. citrus๋ฅผ ์ฌ์ฉํ๋ฉด pectinex์ ๋ฌฝ์ ์ฉ์ก์ ๋ฐค์ ์์ ๊ฒฐํฉ ์กฐ์ง์ ๋ถํดํ๊ฒ ๋ฉ๋๋ค. ๊ฒฐ๊ตญ ์๋ฒฝํ citrus supremes๊ฐ ๋ฉ๋๋ค. ๋ผ์ฆ๋ฒ ๋ฆฌ๋ฅผ ๋ ์งง์ ์๊ฐ ๋์ ์๊ฒ ๋๋ฉด ๊ฐ์ ๋ฐฉ์์ผ๋ก ์ข
์๋ฅผ ๋ถ๋ฆฌํ ์ ์๋์ง ๊ถ๊ธํฉ๋๋ค. ์ฌ๊ธฐ ์์ ๊ฐ ์์ต๋๋ค. https:\/\/www.chefsteps.com\/activities\/perfect-citrus-supreme",
`human_ref_B`:"๋ผ์ฆ๋ฒ ๋ฆฌ ์ฃผ์ค๋ ์ฒ์์๋ ๋ฐ์ ์ผ๋ฃฉ์ ๋ง๋ค์ง๋ง ๋ช ์ฃผ ํ๋ฉด ๊ฑฐ์ ์๋ฌด๊ฒ๋ ์ฌ๋ผ์ง๊ธฐ ์์ํฉ๋๋ค. ์ฒ์ฐ ์ผ๋ฃ ์ธ๊ณ์์ ํ์ฃผ ์ผ๋ฃ๋ก ์๋ ค์ง ๊ฒ์ ์ธํ์ด๋ ๋น์ ๋
ธ์ถ๋์ง ์์๋ ์ฌ๋ผ์ง ๊ฒ์
๋๋ค. ๊ทธ๋
๊ฐ ๋๋ ์ค์ ์ด ์ผ๋ฃฉ์ ๋ฉ์ง ์ฌ์ง์ ๋ง์ด ์ป๊ธฐ๋ฅผ ๋ฐ๋๋๋ค. ๊ณง ๊ทธ๊ฒ์ด ๊ทธ๋
๊ฐ ๋จ๊ธด ์ ๋ถ์ด๊ธฐ ๋๋ฌธ์
๋๋ค."
`labels`:1,
`seconds_difference`:2.0,
`score_ratio`:2.0481927711
}
```
where the fields are:
- ```post_id```: the ID of the Reddit post (string)
- ```domain```: subreddit and split the example is drawn from, separated by underscore (string)
- ```upvote_ratio```: the percent of votes on the post that were positive (a.k.a. upvotes) (float)
- ```history```: post title concatenated to post body (string)
- ```c_root_id_A```: the ID of comment A (string)
- ```c_root_id_B```: the ID of comment B (string)
- ```created_at_utc_A```: utc timestamp of when comment A was created (integer)
- ```created_at_utc_B```: utc timestamp of when comment B was created (integer)
- ```score_A```: (# positive votes - # negative votes + 1) received by comment A (integer)
- ```score_B```: (# positive votes - # negative votes + 1) received by comment B (integer)
- ```human_ref_A```: text of comment A (string)
- ```human_ref_B```: text of comment B (string)
- ```labels```: the preference label -- 1 if A is preferred to B, 0 if B is preferred to A. It was randomized such that the label distribution is roughly 50/50. (integer)
- ```seconds_difference```: how many seconds after the less preferred comment the more preferred one was created (will always be >= 0) (integer)
- ```score_ratio```: the ratio of the more preferred comment's score to the less preferred comment's score (will be >= 1) (float)
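As an unofficial, minimal sketch (not code from the SHP repo), the fields above are mutually consistent and can be cross-checked; the dict below mirrors the example record shown earlier:

```python
# A minimal consistency check for an SHP example's score/label fields.
# The dict mirrors the askculinary example record above.
example = {
    "score_A": 340,
    "score_B": 166,
    "labels": 1,  # 1 means comment A is preferred to comment B
    "created_at_utc_A": 1636822112,
    "created_at_utc_B": 1636822110,
    "seconds_difference": 2.0,
    "score_ratio": 2.0481927711,
}

def check_preference(ex):
    """Return True if the label, scores, and timestamps agree with each other."""
    pref, other = ("A", "B") if ex["labels"] == 1 else ("B", "A")
    # The preferred comment must score higher and be written no earlier.
    score_ok = ex[f"score_{pref}"] > ex[f"score_{other}"]
    time_ok = ex[f"created_at_utc_{pref}"] >= ex[f"created_at_utc_{other}"]
    ratio_ok = abs(ex["score_ratio"] - ex[f"score_{pref}"] / ex[f"score_{other}"]) < 1e-6
    return score_ok and time_ok and ratio_ok

print(check_preference(example))  # True
```

Such a check is a cheap sanity test before training a preference model on the data.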
## Dataset Design
### Domain Selection
The data is sourced from Reddit, a public forum organized into topic-specific forums called *subreddits*.
For example, the `askculinary` subreddit is where users ask cooking-related questions and are answered by other users.
SHP contains a train, validation, and test split for comments scraped from 18 different subreddits. We chose subreddits based on:
1. whether they were well-known (subscriber count >= 100K)
2. whether posts were expected to pose a question or instruction
3. whether responses were valued based on how *helpful* they were
4. whether comments had to be rooted in some objectivity, instead of being entirely about personal experiences (e.g., `askscience` vs. `AskAmericans`)
The train/validation/test splits were created by splitting the post IDs of each subreddit in 90%/5%/5% proportions respectively, so that no post appears in multiple splits.
Since different posts have different numbers of comments, the number of preferences in each split is not exactly 90%/5%/5%:
| subreddit | train | validation | test | total |
| ------------------ | -------: | ---------: | ---: | ----: |
| askacademia | 31450 | 2095 | 1708 | 35253 |
| askanthropology | 3910 | 203 | 268 | 4381 |
| askbaking | 44007 | 2096 | 1544 | 47647 |
| askcarguys | 3227 | 159 | 117 | 3503 |
| askculinary | 45710 | 2094 | 2563 | 50367 |
| askdocs | 6449 | 315 | 455 | 7219 |
| askengineers | 57096 | 3154 | 2638 | 62888 |
| askhistorians | 3264 | 113 | 164 | 3541 |
| askhr | 8295 | 641 | 395 | 9331 |
| askphilosophy | 10307 | 608 | 677 | 11592 |
| askphysics | 7364 | 409 | 587 | 8360 |
| askscience | 13316 | 899 | 977 | 15192 |
| asksciencefiction | 29382 | 1576 | 1987 | 32945 |
| asksocialscience | 2706 | 147 | 188 | 3041 |
| askvet | 3300 | 170 | 224 | 3694 |
| changemyview | 38173 | 1637 | 1836 | 41646 |
| explainlikeimfive | 19592 | 1014 | 1070 | 21676 |
| legaladvice | 21170 | 1106 | 1011 | 23287 |
| ALL | 348718 | 18436 | 18409 | 385563 |
### Data Selection
The score of a post/comment is 1 plus the number of upvotes (approvals) it received from users, minus the number of downvotes (disapprovals).
The value of a score is relative; in subreddits with more traffic, there will be more high-scoring posts (and comments).
Within a post, comments posted earlier will tend to have a higher score simply due to greater exposure, which is why using timestamp information is essential when inferring preferences.
Given a post P and two comments (A, B), we only included the preference A > B in the dataset if
1. A was written *no later than* B and A has a higher score than B.
2. The post is a self-post (i.e., a body of text and not a link to another page) made before 2023, was not edited, and is not NSFW (over 18).
3. Neither comment was made by a deleted user, a moderator, or the post creator. The post was not made by a deleted user or a moderator.
4. The post has a score >= 10 and each comment has a score >= 2 (i.e., it was upvoted at least once).
A post with `n` comments could in principle yield up to (`n` choose `2`) preferences in the data.
Since the number of comments per post is Pareto-distributed, to prevent a relatively small number of posts from dominating the data, we limited the scraping to 50 comments per post.
This means that each post could have at most (`50` choose `2`) preferences in the dataset, though in practice the number is far smaller, since all of the criteria above must be met.
Reddit makes it very difficult to get anything beyond the top 1000 posts for each subreddit.
We started with the top-scoring 1,000 posts and used Reddit's search function to retrieve the 25 most similar posts to each one, yielding up to 7500 unique post IDs per subreddit.
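The selection criteria above can be sketched as a filter over candidate comment pairs. This is an illustrative reconstruction, not the scraper's actual code; the field names (`created_utc`, `is_self`, `over_18`, etc.) are assumptions modeled on Reddit API conventions:

```python
def admissible_pair(post, comment_a, comment_b):
    """Sketch of SHP's data-selection rules for a candidate preference A > B."""
    # 1. A written no later than B, with a strictly higher score.
    if comment_a["created_utc"] > comment_b["created_utc"]:
        return False
    if comment_a["score"] <= comment_b["score"]:
        return False
    # 2. Self-post, unedited, not NSFW.
    if not post["is_self"] or post["edited"] or post["over_18"]:
        return False
    # 3. No deleted users, moderators, or the post author among the commenters.
    banned = {"[deleted]", post["author"]} | set(post.get("moderators", []))
    if comment_a["author"] in banned or comment_b["author"] in banned:
        return False
    # 4. Minimum scores: post >= 10, each comment >= 2.
    if post["score"] < 10 or comment_a["score"] < 2 or comment_b["score"] < 2:
        return False
    return True
```

A pair passing this filter becomes one preference example, with `labels` randomized to balance which comment appears as A.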
### Preprocessing
We tried to keep preprocessing to a minimum. Subreddit-specific abbreviations were expanded (e.g., "CMV" to "Change my view that").
In hyperlinks, only the referring text was kept and the URL was removed (if the URL was written out, it was kept).
## Building a Preference Model
### Finetuning
If you want to finetune a model to predict human preferences (e.g., for NLG evaluation or an RLHF reward model), here are some helpful tips:
1. **Preprocess the data.** The total input length should fit under the model's token limit (usually 512 tokens).
Although models like FLAN-T5 use positional embeddings, we found that the loss would not converge when finetuning on inputs over 512 tokens.
To avoid this, truncate the post text (in the `history` field) as much as possible, such that the whole input fits under 512 tokens (but do not truncate the comments).
If the input is still over 512 tokens, simply skip the example.
2. **Use a sufficiently large model.**
Finetuning a single FLAN-T5-xl model across all the training data should give you a test accuracy between 72-73% (across all domains, on examples where the entire input fits within the token limit), ranging from 65-80% on individual subreddits.
3. **Do in-domain prediction.** Out-of-domain performance will be poor if the subreddits are unrelated (e.g., if you finetune on `askculinary` preferences and test on `askcarguys` preferences).
4. **Train for fewer epochs.** The InstructGPT paper suggests training the reward model for only 1 epoch.
Since the same comment appears in multiple preferences, it is easy to overfit to the data.
5. **Training on less data may help.**
Preferences with a large `score_ratio` (e.g., comment A having twice the score of comment B) provide a stronger signal for finetuning the model, so you may only want to consider preferences above a certain `score_ratio`.
Since the number of preferences per post is Pareto-distributed, you may also want to limit the number of preferences taken from any particular post, to prevent the model from overfitting to certain posts.
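Tip 1 can be sketched as follows. This is an illustrative helper, not official SHP code; the prompt template is an assumption, and a whitespace split stands in for a real tokenizer (in practice you would count tokens with the T5 tokenizer):

```python
MAX_TOKENS = 512

def tokenize(text):
    # Stand-in for a real tokenizer (e.g., T5's); a whitespace split keeps
    # this sketch dependency-free. Swap in your model's tokenizer for real use.
    return text.split()

def build_input(history, ref_a, ref_b, max_tokens=MAX_TOKENS):
    """Truncate the post (history) -- never the comments -- so the whole
    input fits within the token limit. Returns None if the example cannot
    fit even with an empty history (per tip 1: skip such examples)."""
    template = f"POST: RESPONSE A: {ref_a} RESPONSE B: {ref_b} Which is more helpful?"
    budget = max_tokens - len(tokenize(template))
    if budget < 0:
        return None  # the comments alone exceed the limit; skip this example
    truncated = " ".join(tokenize(history)[:budget])
    return f"POST: {truncated} RESPONSE A: {ref_a} RESPONSE B: {ref_b} Which is more helpful?"
```

The key design point is that only `history` is ever shortened, since the comparison hinges on seeing both comments in full.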
### Evaluating
Since it is easier to predict strong preferences than weak ones, instead of reporting a single accuracy value, we recommend reporting a performance curve as a function of the `score_ratio`.
For example, here is the accuracy curve for a FLAN-T5-xl model trained on the askculinary data using the suggestions above.
The orange line comes from finetuning only on preferences with a 2+ score ratio and using no more than 5 preferences from each post to prevent overfitting: [graph](curve.png)
We see that finetuning on less but higher-quality data leads to higher accuracy on test data with a score ratio below 3.5, with no real downsides!
Note that any examples whose inputs did not fit within the token limit were excluded from the experiment, since the model could not be expected to handle them.
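The recommended evaluation can be sketched as a small helper that buckets test examples by `score_ratio` and reports per-bucket accuracy. This is an illustrative sketch, not official SHP code; the bucket edges are arbitrary choices:

```python
from collections import defaultdict

def accuracy_by_score_ratio(examples, predictions,
                            upper_edges=(1.5, 2.0, 3.0, 5.0, float("inf"))):
    """Report accuracy per score_ratio bucket rather than one overall number,
    since stronger preferences (larger ratios) are easier to predict."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for ex, pred in zip(examples, predictions):
        # First bucket whose upper edge exceeds this example's ratio.
        bucket = next(i for i, hi in enumerate(upper_edges) if ex["score_ratio"] < hi)
        total[bucket] += 1
        correct[bucket] += int(pred == ex["labels"])
    # Keyed by bucket upper edge, e.g. {1.5: 0.71, 2.0: 0.74, ...}
    return {upper_edges[b]: correct[b] / total[b] for b in sorted(total)}
```

Plotting these per-bucket accuracies against the edges reproduces the kind of curve described above.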
### SteamSHP - Open-Source Preference Model
We have finetuned two FLAN-T5 models on both the SHP dataset and the helpfulness data in Anthropic's HH-RLHF. They are:
- [SteamSHP-XL](https://huggingface.co/stanfordnlp/SteamSHP-flan-t5-xl), a 3B parameter model that achieves 72.8% on the test data.
- [SteamSHP-Large](https://huggingface.co/stanfordnlp/SteamSHP-flan-t5-large), a 780M parameter model that achieves 72.0% on the test data.
We encourage you to use SteamSHP for NLG evaluation, for building reward models for RLHF, or for any other purpose you deem fit!
## Biases and Limitations
### Biases
Although we filtered out posts with NSFW (over 18) content and chose subreddits that were well-moderated and had policies against harassment and bigotry, some of the data may still contain discriminatory or harmful language.
The data does not reflect the views of the dataset creators.
Reddit users on these subreddits are also not representative of the broader population.
Although subreddit-specific demographic information is not available, Reddit users overall are disproportionately male and from developed, Western, and English-speaking countries ([Pew Research](https://www.pewresearch.org/internet/2013/07/03/6-of-online-adults-are-reddit-users/)).
Please keep this in mind before using any models trained on this data.
### Limitations
The preference label in SHP is meant to reflect how *helpful* one response is relative to another, given an instruction/question.
SHP is not intended for harm-minimization use, as it was not designed to include the toxic content needed to learn a good toxicity detector.
If you are looking for data where the preference label denotes less harm, we recommend the harmfulness splits of [Anthropic's HH-RLHF](https://huggingface.co/datasets/Anthropic/hh-rlhf).
Another limitation is that the more preferred response in SHP is not necessarily the more factual one.
Though some comments provide citations to justify their response, most do not.
There are exceptions, such as the `askhistorians` subreddit, which is heavily moderated and where answers are expected to provide citations.
The collective preference label in SHP is not necessarily what we would get if we asked users to vote on each comment independently before taking a weighted sum.
This is because comment scores on Reddit are public and are known to influence user preferences; a high score increases the likelihood of getting more positive votes [(Muchnik et al., 2013)](https://pubmed.ncbi.nlm.nih.gov/23929980/).
Whether this "herding effect" shifts user preferences temporarily or permanently is unclear.
Therefore, although SHP reflects collective human preferences, models trained on SHP may not generalize to settings where individual preferences are aggregated differently (e.g., users voting independently without ever seeing the current comment score, users voting after conferring, etc.).
Thanks to Greg Stoddard for pointing this out.
## License
Last updated: 03/01/2023
This dataset was made by scraping Reddit in accordance with the [Reddit API Terms of Use](https://docs.google.com/a/reddit.com/forms/d/e/1FAIpQLSezNdDNK1-P8mspSbmtC2r86Ee9ZRbC66u929cG2GX0T9UMyw/viewform), without any direct communication or written agreements with Reddit.
According to the Terms of Use, "User Content" is owned by the users themselves -- not by Reddit -- and Reddit grants a "non-exclusive, non-transferable, non-sublicensable, and revocable license to copy and display the User Content".
Datasets made by scraping Reddit are widely used in the research community. For example, Facebook AI Research used data scraped from Reddit to make the [ELI5](https://huggingface.co/datasets/eli5#source-data) dataset in 2019, which was made available without a license.
Anthropic AI also [scraped Reddit](https://arxiv.org/pdf/2112.00861.pdf) for preferences using a different methodology, though this data was not made public.
The [PushShift Reddit dataset](https://arxiv.org/abs/2001.08435), which makes entire dumps of Reddit available on a regular schedule, is likewise available without a license (to our knowledge).
We take no responsibility for, and do not expressly or implicitly endorse, any downstream use of this dataset.
We reserve the right to modify the SHP dataset and this license at any point in the future.
## Contact
Please contact kawin@stanford.edu if you have any questions about the data.
This dataset was created by Kawin Ethayarajh, Heidi (Chenyu) Zhang, Yizhong Wang, and Dan Jurafsky.
## Citation
SHP was created using the techniques proposed in the following paper. Please cite this work if you use SHP or the SteamSHP models:
```
@InProceedings{pmlr-v162-ethayarajh22a,
title = {Understanding Dataset Difficulty with $\mathcal{V}$-Usable Information},
author = {Ethayarajh, Kawin and Choi, Yejin and Swayamdipta, Swabha},
booktitle = {Proceedings of the 39th International Conference on Machine Learning},
pages = {5988--6008},
year = {2022},
editor = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
volume = {162},
series = {Proceedings of Machine Learning Research},
month = {17--23 Jul},
publisher = {PMLR},
}
```
## References
Ethayarajh, K., Choi, Y. & Swayamdipta, S. (2022). Understanding Dataset Difficulty with $\mathcal{V}$-Usable Information. <i>Proceedings of the 39th International Conference on Machine Learning</i>, in <i>Proceedings of Machine Learning Research</i>. 162:5988-6008 Available from https://proceedings.mlr.press/v162/ethayarajh22a.html. |
liuyanchen1015/MULTI_VALUE_wnli_linking_relcl | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 3249
num_examples: 17
- name: test
num_bytes: 2742
num_examples: 9
- name: train
num_bytes: 17600
num_examples: 92
download_size: 18391
dataset_size: 23591
---
# Dataset Card for "MULTI_VALUE_wnli_linking_relcl"
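The YAML header above fully determines the split sizes; as a quick sanity check, the per-split `num_bytes` values should sum to the declared `dataset_size`. A minimal stdlib-only sketch, with the numbers copied from the header:

```python
# Split metadata copied from the card's YAML header.
splits = {
    "dev":   {"num_bytes": 3249,  "num_examples": 17},
    "test":  {"num_bytes": 2742,  "num_examples": 9},
    "train": {"num_bytes": 17600, "num_examples": 92},
}
dataset_size = 23591  # declared total, in bytes

total_bytes = sum(s["num_bytes"] for s in splits.values())
total_examples = sum(s["num_examples"] for s in splits.values())

# The declared dataset_size equals the sum of the split byte counts.
assert total_bytes == dataset_size
print(total_bytes, total_examples)  # → 23591 118
```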
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
deokhk/zh_wiki_sentences_1000000 | ---
dataset_info:
features:
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 127836004
num_examples: 1000000
- name: dev
num_bytes: 135625
num_examples: 1000
download_size: 88011343
dataset_size: 127971629
---
# Dataset Card for "zh_wiki_sentences_1000000"
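The split metadata above also gives a rough sense of sentence length. A stdlib-only sketch estimating average UTF-8 bytes per sentence from the YAML numbers (note that `num_bytes` includes some serialization overhead, so treat these as upper-bound estimates):

```python
# Split metadata copied from the card's YAML header.
train_bytes, train_examples = 127_836_004, 1_000_000
dev_bytes, dev_examples = 135_625, 1_000

avg_train = train_bytes / train_examples  # average bytes per train sentence
avg_dev = dev_bytes / dev_examples        # average bytes per dev sentence

print(round(avg_train, 1), round(avg_dev, 1))  # → 127.8 135.6
# Most CJK characters occupy 3 bytes in UTF-8, so a train sentence is
# on the order of 128 / 3 ≈ 43 characters.
```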
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Birchlabs/openai-prm800k-phase2_train-stepwise-best | ---
license: mit
---
|
open-llm-leaderboard/details_namirocks__student-model-13b-ep3 | ---
pretty_name: Evaluation run of namirocks/student-model-13b-ep3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [namirocks/student-model-13b-ep3](https://huggingface.co/namirocks/student-model-13b-ep3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_namirocks__student-model-13b-ep3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-30T01:37:20.077989](https://huggingface.co/datasets/open-llm-leaderboard/details_namirocks__student-model-13b-ep3/blob/main/results_2023-12-30T01-37-20.077989.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5621545466705268,\n\
\ \"acc_stderr\": 0.03351539520737431,\n \"acc_norm\": 0.5727655950528345,\n\
\ \"acc_norm_stderr\": 0.03442395762278095,\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.014869755015871105,\n \"mc2\": 0.35003126952306707,\n\
\ \"mc2_stderr\": 0.014347219852780793\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.43856655290102387,\n \"acc_stderr\": 0.014500682618212865,\n\
\ \"acc_norm\": 0.46501706484641636,\n \"acc_norm_stderr\": 0.01457558392201966\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6061541525592511,\n\
\ \"acc_stderr\": 0.004876028037941937,\n \"acc_norm\": 0.8036247759410476,\n\
\ \"acc_norm_stderr\": 0.003964437012249992\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.04008973785779206,\n\
\ \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.04008973785779206\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n\
\ \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6041666666666666,\n\
\ \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.6041666666666666,\n\
\ \"acc_norm_stderr\": 0.04089465449325582\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\"\
: 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5664739884393064,\n\
\ \"acc_stderr\": 0.03778621079092056,\n \"acc_norm\": 0.5664739884393064,\n\
\ \"acc_norm_stderr\": 0.03778621079092056\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.04655010411319616,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.04655010411319616\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.43829787234042555,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.43829787234042555,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.043036840335373146,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.043036840335373146\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3412698412698413,\n \"acc_stderr\": 0.024419234966819067,\n \"\
acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.024419234966819067\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6483870967741936,\n \"acc_stderr\": 0.02716253782694846,\n \"\
acc_norm\": 0.6483870967741936,\n \"acc_norm_stderr\": 0.02716253782694846\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"\
acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562427,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562427\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.703030303030303,\n \"acc_stderr\": 0.035679697722680495,\n\
\ \"acc_norm\": 0.703030303030303,\n \"acc_norm_stderr\": 0.035679697722680495\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6919191919191919,\n \"acc_stderr\": 0.03289477330098617,\n \"\
acc_norm\": 0.6919191919191919,\n \"acc_norm_stderr\": 0.03289477330098617\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.028408953626245265,\n\
\ \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.028408953626245265\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5846153846153846,\n \"acc_stderr\": 0.02498535492310233,\n \
\ \"acc_norm\": 0.5846153846153846,\n \"acc_norm_stderr\": 0.02498535492310233\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948496,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948496\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6008403361344538,\n \"acc_stderr\": 0.03181110032413926,\n \
\ \"acc_norm\": 0.6008403361344538,\n \"acc_norm_stderr\": 0.03181110032413926\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7559633027522936,\n \"acc_stderr\": 0.0184152863514164,\n \"acc_norm\"\
: 0.7559633027522936,\n \"acc_norm_stderr\": 0.0184152863514164\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.03388857118502326,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.03388857118502326\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.030190282453501947,\n\
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.030190282453501947\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7552742616033755,\n \"acc_stderr\": 0.02798569938703642,\n \
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.02798569938703642\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.031493846709941306,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.031493846709941306\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6870229007633588,\n \"acc_stderr\": 0.04066962905677698,\n\
\ \"acc_norm\": 0.6870229007633588,\n \"acc_norm_stderr\": 0.04066962905677698\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.03680350371286461,\n\
\ \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.03680350371286461\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.024414947304543678,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.024414947304543678\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7535121328224776,\n\
\ \"acc_stderr\": 0.015411308769686936,\n \"acc_norm\": 0.7535121328224776,\n\
\ \"acc_norm_stderr\": 0.015411308769686936\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.026296227915613663,\n\
\ \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.026296227915613663\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33519553072625696,\n\
\ \"acc_stderr\": 0.01578800719018588,\n \"acc_norm\": 0.33519553072625696,\n\
\ \"acc_norm_stderr\": 0.01578800719018588\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6339869281045751,\n \"acc_stderr\": 0.027582811415159617,\n\
\ \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.027582811415159617\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6334405144694534,\n\
\ \"acc_stderr\": 0.027368078243971635,\n \"acc_norm\": 0.6334405144694534,\n\
\ \"acc_norm_stderr\": 0.027368078243971635\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.026869490744815247,\n\
\ \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.026869490744815247\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4326241134751773,\n \"acc_stderr\": 0.02955545423677886,\n \
\ \"acc_norm\": 0.4326241134751773,\n \"acc_norm_stderr\": 0.02955545423677886\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.423728813559322,\n\
\ \"acc_stderr\": 0.01262078515588599,\n \"acc_norm\": 0.423728813559322,\n\
\ \"acc_norm_stderr\": 0.01262078515588599\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.03035230339535196,\n\
\ \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.03035230339535196\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5816993464052288,\n \"acc_stderr\": 0.019955975145835542,\n \
\ \"acc_norm\": 0.5816993464052288,\n \"acc_norm_stderr\": 0.019955975145835542\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6489795918367347,\n \"acc_stderr\": 0.03055531675557364,\n\
\ \"acc_norm\": 0.6489795918367347,\n \"acc_norm_stderr\": 0.03055531675557364\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7711442786069652,\n\
\ \"acc_stderr\": 0.029705284056772436,\n \"acc_norm\": 0.7711442786069652,\n\
\ \"acc_norm_stderr\": 0.029705284056772436\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338734,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338734\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.014869755015871105,\n \"mc2\": 0.35003126952306707,\n\
\ \"mc2_stderr\": 0.014347219852780793\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7221783741120757,\n \"acc_stderr\": 0.012588918183871596\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/namirocks/student-model-13b-ep3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|arc:challenge|25_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|gsm8k|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hellaswag|10_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T01-37-20.077989.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T01-37-20.077989.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- '**/details_harness|winogrande|5_2023-12-30T01-37-20.077989.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-30T01-37-20.077989.parquet'
- config_name: results
data_files:
- split: 2023_12_30T01_37_20.077989
path:
- results_2023-12-30T01-37-20.077989.parquet
- split: latest
path:
- results_2023-12-30T01-37-20.077989.parquet
---
# Dataset Card for Evaluation run of namirocks/student-model-13b-ep3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [namirocks/student-model-13b-ep3](https://huggingface.co/namirocks/student-model-13b-ep3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_namirocks__student-model-13b-ep3",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-12-30T01:37:20.077989](https://huggingface.co/datasets/open-llm-leaderboard/details_namirocks__student-model-13b-ep3/blob/main/results_2023-12-30T01-37-20.077989.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you will find each of them in the "results" configuration and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.5621545466705268,
"acc_stderr": 0.03351539520737431,
"acc_norm": 0.5727655950528345,
"acc_norm_stderr": 0.03442395762278095,
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871105,
"mc2": 0.35003126952306707,
"mc2_stderr": 0.014347219852780793
},
"harness|arc:challenge|25": {
"acc": 0.43856655290102387,
"acc_stderr": 0.014500682618212865,
"acc_norm": 0.46501706484641636,
"acc_norm_stderr": 0.01457558392201966
},
"harness|hellaswag|10": {
"acc": 0.6061541525592511,
"acc_stderr": 0.004876028037941937,
"acc_norm": 0.8036247759410476,
"acc_norm_stderr": 0.003964437012249992
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5855263157894737,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.5855263157894737,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6041666666666666,
"acc_stderr": 0.04089465449325582,
"acc_norm": 0.6041666666666666,
"acc_norm_stderr": 0.04089465449325582
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.03778621079092056,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.03778621079092056
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.04655010411319616,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.04655010411319616
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.43829787234042555,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.43829787234042555,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.043036840335373146,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.043036840335373146
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.024419234966819067,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.024419234966819067
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6483870967741936,
"acc_stderr": 0.02716253782694846,
"acc_norm": 0.6483870967741936,
"acc_norm_stderr": 0.02716253782694846
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562427,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562427
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.703030303030303,
"acc_stderr": 0.035679697722680495,
"acc_norm": 0.703030303030303,
"acc_norm_stderr": 0.035679697722680495
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6919191919191919,
"acc_stderr": 0.03289477330098617,
"acc_norm": 0.6919191919191919,
"acc_norm_stderr": 0.03289477330098617
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.028408953626245265,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.028408953626245265
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5846153846153846,
"acc_stderr": 0.02498535492310233,
"acc_norm": 0.5846153846153846,
"acc_norm_stderr": 0.02498535492310233
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948496,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6008403361344538,
"acc_stderr": 0.03181110032413926,
"acc_norm": 0.6008403361344538,
"acc_norm_stderr": 0.03181110032413926
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7559633027522936,
"acc_stderr": 0.0184152863514164,
"acc_norm": 0.7559633027522936,
"acc_norm_stderr": 0.0184152863514164
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502326,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502326
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.030190282453501947,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.030190282453501947
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.02798569938703642,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.02798569938703642
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.031493846709941306,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.031493846709941306
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6870229007633588,
"acc_stderr": 0.04066962905677698,
"acc_norm": 0.6870229007633588,
"acc_norm_stderr": 0.04066962905677698
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.03680350371286461,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.03680350371286461
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.024414947304543678,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.024414947304543678
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7535121328224776,
"acc_stderr": 0.015411308769686936,
"acc_norm": 0.7535121328224776,
"acc_norm_stderr": 0.015411308769686936
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.026296227915613663,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.026296227915613663
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33519553072625696,
"acc_stderr": 0.01578800719018588,
"acc_norm": 0.33519553072625696,
"acc_norm_stderr": 0.01578800719018588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.027582811415159617,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.027582811415159617
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6334405144694534,
"acc_stderr": 0.027368078243971635,
"acc_norm": 0.6334405144694534,
"acc_norm_stderr": 0.027368078243971635
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.026869490744815247,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.026869490744815247
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4326241134751773,
"acc_stderr": 0.02955545423677886,
"acc_norm": 0.4326241134751773,
"acc_norm_stderr": 0.02955545423677886
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.423728813559322,
"acc_stderr": 0.01262078515588599,
"acc_norm": 0.423728813559322,
"acc_norm_stderr": 0.01262078515588599
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5183823529411765,
"acc_stderr": 0.03035230339535196,
"acc_norm": 0.5183823529411765,
"acc_norm_stderr": 0.03035230339535196
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5816993464052288,
"acc_stderr": 0.019955975145835542,
"acc_norm": 0.5816993464052288,
"acc_norm_stderr": 0.019955975145835542
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6489795918367347,
"acc_stderr": 0.03055531675557364,
"acc_norm": 0.6489795918367347,
"acc_norm_stderr": 0.03055531675557364
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7711442786069652,
"acc_stderr": 0.029705284056772436,
"acc_norm": 0.7711442786069652,
"acc_norm_stderr": 0.029705284056772436
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.03158149539338734,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.03158149539338734
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871105,
"mc2": 0.35003126952306707,
"mc2_stderr": 0.014347219852780793
},
"harness|winogrande|5": {
"acc": 0.7221783741120757,
"acc_stderr": 0.012588918183871596
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
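Once loaded, the per-task metrics above can be aggregated locally. The sketch below is a hypothetical example (the `results` dict is a small excerpt of the JSON shown above, not the full payload) of computing an average MMLU accuracy by filtering on the `harness|hendrycksTest-` key prefix used in this card:

```python
# Hypothetical sketch: average the per-task "acc" values for MMLU
# (hendrycksTest) entries. The key prefix and field names mirror the
# results JSON shown above; only three tasks are included here.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.32},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.4666666666666667},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.5855263157894737},
}

# Keep only MMLU tasks and collect their accuracies.
mmlu_accs = [
    v["acc"]
    for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU average acc over {len(mmlu_accs)} tasks: {mmlu_avg:.4f}")
```

The same pattern applies to the full results file: load the JSON, filter keys by benchmark prefix, and average the metric of interest.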
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
tyzhu/squad_wrong_title_v3_train_10_eval_10 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 276687
num_examples: 184
- name: validation
num_bytes: 64754
num_examples: 68
download_size: 71442
dataset_size: 341441
---
# Dataset Card for "squad_wrong_title_v3_train_10_eval_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
divers/requirement-question | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: index
dtype: int64
- name: job_requirement
dtype: string
- name: questions
dtype: string
splits:
- name: train
num_bytes: 35480682
num_examples: 23237
download_size: 4168927
dataset_size: 35480682
---
# Dataset Card for "requirement-question"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Kaue123456/JoaoGriloMatheusNachtergaele | ---
license: openrail
---
|
arthurmluz/cstnews_data-temario_results | ---
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
- name: summary
dtype: string
- name: gen_summary
dtype: string
- name: rouge
struct:
- name: rouge1
dtype: float64
- name: rouge2
dtype: float64
- name: rougeL
dtype: float64
- name: rougeLsum
dtype: float64
- name: bert
struct:
- name: f1
sequence: float64
- name: hashcode
dtype: string
- name: precision
sequence: float64
- name: recall
sequence: float64
- name: moverScore
dtype: float64
splits:
- name: validation
num_bytes: 69932
num_examples: 16
download_size: 0
dataset_size: 69932
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "cstnews_data-temario_results"
rouge = {'rouge1': 0.5207584715132082, 'rouge2': 0.34711381882009107, 'rougeL': 0.38095639884621346, 'rougeLsum': 0.38095639884621346}

bert = {'precision': 0.7428307943046093, 'recall': 0.8364794515073299, 'f1': 0.7866528294980526}

mover = 0.6287250343090405 |
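The `bert` struct in this card's schema stores per-example `precision`/`recall`/`f1` sequences, while the summary line reports single corpus-level numbers — presumably the means of those sequences. A minimal sketch of that aggregation (the helper name and the example scores below are hypothetical, not taken from this dataset):

```python
# Hypothetical sketch: averaging per-example BERTScore lists into
# corpus-level precision/recall/f1 figures. The example values are
# illustrative, not the card's actual per-example scores.

def aggregate_bertscore(scores: dict) -> dict:
    """Average the per-example precision/recall/f1 sequences,
    ignoring non-numeric fields such as the hashcode."""
    return {
        key: sum(values) / len(values)
        for key, values in scores.items()
        if key in ("precision", "recall", "f1")
    }

# Illustrative per-example scores for three generated summaries.
example_scores = {
    "precision": [0.71, 0.75, 0.77],
    "recall": [0.82, 0.84, 0.85],
    "f1": [0.76, 0.79, 0.81],
    "hashcode": "bert-base-multilingual-cased",  # skipped by the aggregation
}

print(aggregate_bertscore(example_scores))
```

Swapping in the real per-example lists from the `bert` struct would yield the aggregate figures shown in the summary line.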
open-llm-leaderboard/details_OpenAssistant__stablelm-7b-sft-v7-epoch-3 | ---
pretty_name: Evaluation run of OpenAssistant/stablelm-7b-sft-v7-epoch-3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [OpenAssistant/stablelm-7b-sft-v7-epoch-3](https://huggingface.co/OpenAssistant/stablelm-7b-sft-v7-epoch-3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenAssistant__stablelm-7b-sft-v7-epoch-3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-13T03:23:25.661445](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenAssistant__stablelm-7b-sft-v7-epoch-3/blob/main/results_2023-10-13T03-23-25.661445.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.05578859060402685,\n\
\ \"em_stderr\": 0.0023504280872280073,\n \"f1\": 0.10613569630872476,\n\
\ \"f1_stderr\": 0.0026144580255279513,\n \"acc\": 0.27616530425036784,\n\
\ \"acc_stderr\": 0.007839405520583978\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.05578859060402685,\n \"em_stderr\": 0.0023504280872280073,\n\
\ \"f1\": 0.10613569630872476,\n \"f1_stderr\": 0.0026144580255279513\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0037907505686125853,\n \
\ \"acc_stderr\": 0.0016927007401501943\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5485398579321231,\n \"acc_stderr\": 0.01398611030101776\n\
\ }\n}\n```"
repo_url: https://huggingface.co/OpenAssistant/stablelm-7b-sft-v7-epoch-3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|arc:challenge|25_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|arc:challenge|25_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_13T03_23_25.661445
path:
- '**/details_harness|drop|3_2023-10-13T03-23-25.661445.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-13T03-23-25.661445.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_13T03_23_25.661445
path:
- '**/details_harness|gsm8k|5_2023-10-13T03-23-25.661445.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-13T03-23-25.661445.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hellaswag|10_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hellaswag|10_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:06:42.731727.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:07:54.588127.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T17:06:42.731727.parquet'
- split: 2023_07_19T17_07_54.588127
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T17:07:54.588127.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T17:07:54.588127.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_13T03_23_25.661445
path:
- '**/details_harness|winogrande|5_2023-10-13T03-23-25.661445.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-13T03-23-25.661445.parquet'
- config_name: results
data_files:
- split: 2023_07_19T17_06_42.731727
path:
- results_2023-07-19T17:06:42.731727.parquet
- split: 2023_07_19T17_07_54.588127
path:
- results_2023-07-19T17:07:54.588127.parquet
- split: 2023_10_13T03_23_25.661445
path:
- results_2023-10-13T03-23-25.661445.parquet
- split: latest
path:
- results_2023-10-13T03-23-25.661445.parquet
---
# Dataset Card for Evaluation run of OpenAssistant/stablelm-7b-sft-v7-epoch-3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/OpenAssistant/stablelm-7b-sft-v7-epoch-3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [OpenAssistant/stablelm-7b-sft-v7-epoch-3](https://huggingface.co/OpenAssistant/stablelm-7b-sft-v7-epoch-3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
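Split names encode the run timestamp with underscores in place of the `-` and `:` characters (which are not allowed in split names). A small helper (hypothetical, not part of the `datasets` library) can recover the original timestamp:

```python
from datetime import datetime

def split_name_to_datetime(split_name: str) -> datetime:
    """Parse a run split name such as '2023_07_19T17_07_54.588127'
    (underscores in place of '-' and ':') back into a datetime."""
    date_part, time_part = split_name.split("T")
    iso = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    return datetime.fromisoformat(iso)

print(split_name_to_datetime("2023_07_19T17_07_54.588127"))
# → 2023-07-19 17:07:54.588127
```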
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenAssistant__stablelm-7b-sft-v7-epoch-3",
"harness_winogrande_5",
	split="latest")
```
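Because the zero-padded timestamps sort chronologically, the run the "latest" alias points to can also be identified by taking the lexicographic maximum of the timestamped split names. A minimal sketch, using the split names from this card as sample data:

```python
# Timestamped split names as they appear in this dataset's configurations.
splits = [
    "2023_07_19T17_06_42.731727",
    "2023_07_19T17_07_54.588127",
    "2023_10_13T03_23_25.661445",
]

# Zero-padded timestamps sort chronologically, so the lexicographic
# maximum is the newest run -- the one the "latest" split aliases.
latest = max(splits)
print(latest)  # → 2023_10_13T03_23_25.661445
```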
## Latest results
These are the [latest results from run 2023-10-13T03:23:25.661445](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenAssistant__stablelm-7b-sft-v7-epoch-3/blob/main/results_2023-10-13T03-23-25.661445.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"em": 0.05578859060402685,
"em_stderr": 0.0023504280872280073,
"f1": 0.10613569630872476,
"f1_stderr": 0.0026144580255279513,
"acc": 0.27616530425036784,
"acc_stderr": 0.007839405520583978
},
"harness|drop|3": {
"em": 0.05578859060402685,
"em_stderr": 0.0023504280872280073,
"f1": 0.10613569630872476,
"f1_stderr": 0.0026144580255279513
},
"harness|gsm8k|5": {
"acc": 0.0037907505686125853,
"acc_stderr": 0.0016927007401501943
},
"harness|winogrande|5": {
"acc": 0.5485398579321231,
"acc_stderr": 0.01398611030101776
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
atmallen/neg_companies_azaria_mitchell | ---
dataset_info:
features:
- name: statement
dtype: string
- name: label
dtype:
class_label:
names:
'0': 'false'
'1': 'true'
splits:
- name: train
num_bytes: 75670.4
num_examples: 880
- name: test
num_bytes: 18917.6
num_examples: 220
download_size: 29413
dataset_size: 94588.0
---
# Dataset Card for "neg_companies_azaria_mitchell"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dongyoung4091/shp-generated_flan_t5_large_with_features | ---
dataset_info:
features:
- name: response
dtype: string
- name: prompt
dtype: string
- name: helpfulness
dtype: int64
- name: specificity
dtype: int64
- name: intent
dtype: int64
- name: factuality
dtype: int64
- name: easy-to-understand
dtype: int64
- name: relevance
dtype: int64
- name: readability
dtype: int64
- name: enough-detail
dtype: int64
- name: 'biased:'
dtype: int64
- name: fail-to-consider-individual-preferences
dtype: int64
- name: repetetive
dtype: int64
- name: fail-to-consider-context
dtype: int64
- name: too-long
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1736538
num_examples: 1500
download_size: 215337
dataset_size: 1736538
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "shp-generated_flan_t5_large_with_features"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_yyh0901__lloma_step200 | ---
pretty_name: Evaluation run of yyh0901/lloma_step200
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yyh0901/lloma_step200](https://huggingface.co/yyh0901/lloma_step200) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of\
  \ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yyh0901__lloma_step200\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-06T13:05:40.214845](https://huggingface.co/datasets/open-llm-leaderboard/details_yyh0901__lloma_step200/blob/main/results_2024-04-06T13-05-40.214845.json)(note\
  \ that there might be results for other tasks in the repo if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4370638206369057,\n\
\ \"acc_stderr\": 0.034251250239833685,\n \"acc_norm\": 0.44330275084620024,\n\
\ \"acc_norm_stderr\": 0.035090552757218396,\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.014816195991931578,\n \"mc2\": 0.398074913504716,\n\
\ \"mc2_stderr\": 0.01370017096726305\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.46245733788395904,\n \"acc_stderr\": 0.01457014449507558,\n\
\ \"acc_norm\": 0.5068259385665529,\n \"acc_norm_stderr\": 0.014610029151379813\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.571400119498108,\n\
\ \"acc_stderr\": 0.004938643787869547,\n \"acc_norm\": 0.7714598685520813,\n\
\ \"acc_norm_stderr\": 0.004190341541141985\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.42962962962962964,\n\
\ \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.42962962962962964,\n\
\ \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4342105263157895,\n \"acc_stderr\": 0.0403356566784832,\n\
\ \"acc_norm\": 0.4342105263157895,\n \"acc_norm_stderr\": 0.0403356566784832\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.46,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.44528301886792454,\n \"acc_stderr\": 0.030588052974270658,\n\
\ \"acc_norm\": 0.44528301886792454,\n \"acc_norm_stderr\": 0.030588052974270658\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4166666666666667,\n\
\ \"acc_stderr\": 0.04122728707651282,\n \"acc_norm\": 0.4166666666666667,\n\
\ \"acc_norm_stderr\": 0.04122728707651282\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.36416184971098264,\n\
\ \"acc_stderr\": 0.03669072477416908,\n \"acc_norm\": 0.36416184971098264,\n\
\ \"acc_norm_stderr\": 0.03669072477416908\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179964,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179964\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4425531914893617,\n \"acc_stderr\": 0.032469569197899575,\n\
\ \"acc_norm\": 0.4425531914893617,\n \"acc_norm_stderr\": 0.032469569197899575\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159393,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159393\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.040824829046386284,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.040824829046386284\n \
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708624,\n \"\
acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708624\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.038932596106046734,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.038932596106046734\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4645161290322581,\n\
\ \"acc_stderr\": 0.028372287797962956,\n \"acc_norm\": 0.4645161290322581,\n\
\ \"acc_norm_stderr\": 0.028372287797962956\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.31527093596059114,\n \"acc_stderr\": 0.03269080871970187,\n\
\ \"acc_norm\": 0.31527093596059114,\n \"acc_norm_stderr\": 0.03269080871970187\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5212121212121212,\n \"acc_stderr\": 0.03900828913737302,\n\
\ \"acc_norm\": 0.5212121212121212,\n \"acc_norm_stderr\": 0.03900828913737302\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4797979797979798,\n \"acc_stderr\": 0.03559443565563919,\n \"\
acc_norm\": 0.4797979797979798,\n \"acc_norm_stderr\": 0.03559443565563919\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6424870466321243,\n \"acc_stderr\": 0.034588160421810114,\n\
\ \"acc_norm\": 0.6424870466321243,\n \"acc_norm_stderr\": 0.034588160421810114\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.441025641025641,\n \"acc_stderr\": 0.025174048384000745,\n \
\ \"acc_norm\": 0.441025641025641,\n \"acc_norm_stderr\": 0.025174048384000745\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340496,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340496\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.031041941304059278,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.031041941304059278\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6110091743119266,\n \"acc_stderr\": 0.020902300887392873,\n \"\
acc_norm\": 0.6110091743119266,\n \"acc_norm_stderr\": 0.020902300887392873\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.27314814814814814,\n \"acc_stderr\": 0.03038805130167812,\n \"\
acc_norm\": 0.27314814814814814,\n \"acc_norm_stderr\": 0.03038805130167812\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5245098039215687,\n \"acc_stderr\": 0.03505093194348798,\n \"\
acc_norm\": 0.5245098039215687,\n \"acc_norm_stderr\": 0.03505093194348798\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5780590717299579,\n \"acc_stderr\": 0.032148146302403695,\n \
\ \"acc_norm\": 0.5780590717299579,\n \"acc_norm_stderr\": 0.032148146302403695\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5201793721973094,\n\
\ \"acc_stderr\": 0.033530461674123005,\n \"acc_norm\": 0.5201793721973094,\n\
\ \"acc_norm_stderr\": 0.033530461674123005\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5419847328244275,\n \"acc_stderr\": 0.04369802690578756,\n\
\ \"acc_norm\": 0.5419847328244275,\n \"acc_norm_stderr\": 0.04369802690578756\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5950413223140496,\n \"acc_stderr\": 0.04481137755942469,\n \"\
acc_norm\": 0.5950413223140496,\n \"acc_norm_stderr\": 0.04481137755942469\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.49074074074074076,\n\
\ \"acc_stderr\": 0.04832853553437055,\n \"acc_norm\": 0.49074074074074076,\n\
\ \"acc_norm_stderr\": 0.04832853553437055\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4294478527607362,\n \"acc_stderr\": 0.038890666191127216,\n\
\ \"acc_norm\": 0.4294478527607362,\n \"acc_norm_stderr\": 0.038890666191127216\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3592233009708738,\n \"acc_stderr\": 0.047504583990416946,\n\
\ \"acc_norm\": 0.3592233009708738,\n \"acc_norm_stderr\": 0.047504583990416946\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6623931623931624,\n\
\ \"acc_stderr\": 0.030980296992618558,\n \"acc_norm\": 0.6623931623931624,\n\
\ \"acc_norm_stderr\": 0.030980296992618558\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5887611749680716,\n\
\ \"acc_stderr\": 0.01759597190805657,\n \"acc_norm\": 0.5887611749680716,\n\
\ \"acc_norm_stderr\": 0.01759597190805657\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4913294797687861,\n \"acc_stderr\": 0.026915047355369804,\n\
\ \"acc_norm\": 0.4913294797687861,\n \"acc_norm_stderr\": 0.026915047355369804\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.42810457516339867,\n \"acc_stderr\": 0.028332397483664274,\n\
\ \"acc_norm\": 0.42810457516339867,\n \"acc_norm_stderr\": 0.028332397483664274\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5401929260450161,\n\
\ \"acc_stderr\": 0.028306190403305696,\n \"acc_norm\": 0.5401929260450161,\n\
\ \"acc_norm_stderr\": 0.028306190403305696\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.027815973433878014,\n\
\ \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.027815973433878014\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.35106382978723405,\n \"acc_stderr\": 0.028473501272963764,\n \
\ \"acc_norm\": 0.35106382978723405,\n \"acc_norm_stderr\": 0.028473501272963764\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3539765319426336,\n\
\ \"acc_stderr\": 0.01221350473173164,\n \"acc_norm\": 0.3539765319426336,\n\
\ \"acc_norm_stderr\": 0.01221350473173164\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.030352303395351964,\n\
\ \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.030352303395351964\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4215686274509804,\n \"acc_stderr\": 0.019977422600227467,\n \
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.019977422600227467\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5181818181818182,\n\
\ \"acc_stderr\": 0.04785964010794915,\n \"acc_norm\": 0.5181818181818182,\n\
\ \"acc_norm_stderr\": 0.04785964010794915\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4204081632653061,\n \"acc_stderr\": 0.03160106993449604,\n\
\ \"acc_norm\": 0.4204081632653061,\n \"acc_norm_stderr\": 0.03160106993449604\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5920398009950248,\n\
\ \"acc_stderr\": 0.03475116365194092,\n \"acc_norm\": 0.5920398009950248,\n\
\ \"acc_norm_stderr\": 0.03475116365194092\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.35542168674698793,\n\
\ \"acc_stderr\": 0.03726214354322415,\n \"acc_norm\": 0.35542168674698793,\n\
\ \"acc_norm_stderr\": 0.03726214354322415\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6549707602339181,\n \"acc_stderr\": 0.036459813773888065,\n\
\ \"acc_norm\": 0.6549707602339181,\n \"acc_norm_stderr\": 0.036459813773888065\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.014816195991931578,\n \"mc2\": 0.398074913504716,\n\
\ \"mc2_stderr\": 0.01370017096726305\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7087608524072613,\n \"acc_stderr\": 0.012769029305370697\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04169825625473844,\n \
\ \"acc_stderr\": 0.005506205058175746\n }\n}\n```"
repo_url: https://huggingface.co/yyh0901/lloma_step200
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|arc:challenge|25_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|gsm8k|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hellaswag|10_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T13-05-40.214845.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-06T13-05-40.214845.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- '**/details_harness|winogrande|5_2024-04-06T13-05-40.214845.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-06T13-05-40.214845.parquet'
- config_name: results
data_files:
- split: 2024_04_06T13_05_40.214845
path:
- results_2024-04-06T13-05-40.214845.parquet
- split: latest
path:
- results_2024-04-06T13-05-40.214845.parquet
---
# Dataset Card for Evaluation run of yyh0901/lloma_step200
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [yyh0901/lloma_step200](https://huggingface.co/yyh0901/lloma_step200) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yyh0901__lloma_step200",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-04-06T13:05:40.214845](https://huggingface.co/datasets/open-llm-leaderboard/details_yyh0901__lloma_step200/blob/main/results_2024-04-06T13-05-40.214845.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4370638206369057,
"acc_stderr": 0.034251250239833685,
"acc_norm": 0.44330275084620024,
"acc_norm_stderr": 0.035090552757218396,
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931578,
"mc2": 0.398074913504716,
"mc2_stderr": 0.01370017096726305
},
"harness|arc:challenge|25": {
"acc": 0.46245733788395904,
"acc_stderr": 0.01457014449507558,
"acc_norm": 0.5068259385665529,
"acc_norm_stderr": 0.014610029151379813
},
"harness|hellaswag|10": {
"acc": 0.571400119498108,
"acc_stderr": 0.004938643787869547,
"acc_norm": 0.7714598685520813,
"acc_norm_stderr": 0.004190341541141985
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.42962962962962964,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.42962962962962964,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4342105263157895,
"acc_stderr": 0.0403356566784832,
"acc_norm": 0.4342105263157895,
"acc_norm_stderr": 0.0403356566784832
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.44528301886792454,
"acc_stderr": 0.030588052974270658,
"acc_norm": 0.44528301886792454,
"acc_norm_stderr": 0.030588052974270658
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.04122728707651282,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.04122728707651282
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.36416184971098264,
"acc_stderr": 0.03669072477416908,
"acc_norm": 0.36416184971098264,
"acc_norm_stderr": 0.03669072477416908
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179964,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179964
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4425531914893617,
"acc_stderr": 0.032469569197899575,
"acc_norm": 0.4425531914893617,
"acc_norm_stderr": 0.032469569197899575
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159393,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159393
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4,
"acc_stderr": 0.040824829046386284,
"acc_norm": 0.4,
"acc_norm_stderr": 0.040824829046386284
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.022717467897708624,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.022717467897708624
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.038932596106046734,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.038932596106046734
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4645161290322581,
"acc_stderr": 0.028372287797962956,
"acc_norm": 0.4645161290322581,
"acc_norm_stderr": 0.028372287797962956
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.31527093596059114,
"acc_stderr": 0.03269080871970187,
"acc_norm": 0.31527093596059114,
"acc_norm_stderr": 0.03269080871970187
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5212121212121212,
"acc_stderr": 0.03900828913737302,
"acc_norm": 0.5212121212121212,
"acc_norm_stderr": 0.03900828913737302
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4797979797979798,
"acc_stderr": 0.03559443565563919,
"acc_norm": 0.4797979797979798,
"acc_norm_stderr": 0.03559443565563919
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6424870466321243,
"acc_stderr": 0.034588160421810114,
"acc_norm": 0.6424870466321243,
"acc_norm_stderr": 0.034588160421810114
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.441025641025641,
"acc_stderr": 0.025174048384000745,
"acc_norm": 0.441025641025641,
"acc_norm_stderr": 0.025174048384000745
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340496,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.031041941304059278,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.031041941304059278
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6110091743119266,
"acc_stderr": 0.020902300887392873,
"acc_norm": 0.6110091743119266,
"acc_norm_stderr": 0.020902300887392873
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.27314814814814814,
"acc_stderr": 0.03038805130167812,
"acc_norm": 0.27314814814814814,
"acc_norm_stderr": 0.03038805130167812
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5245098039215687,
"acc_stderr": 0.03505093194348798,
"acc_norm": 0.5245098039215687,
"acc_norm_stderr": 0.03505093194348798
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5780590717299579,
"acc_stderr": 0.032148146302403695,
"acc_norm": 0.5780590717299579,
"acc_norm_stderr": 0.032148146302403695
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5201793721973094,
"acc_stderr": 0.033530461674123005,
"acc_norm": 0.5201793721973094,
"acc_norm_stderr": 0.033530461674123005
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5419847328244275,
"acc_stderr": 0.04369802690578756,
"acc_norm": 0.5419847328244275,
"acc_norm_stderr": 0.04369802690578756
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5950413223140496,
"acc_stderr": 0.04481137755942469,
"acc_norm": 0.5950413223140496,
"acc_norm_stderr": 0.04481137755942469
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.04832853553437055,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.04832853553437055
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4294478527607362,
"acc_stderr": 0.038890666191127216,
"acc_norm": 0.4294478527607362,
"acc_norm_stderr": 0.038890666191127216
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.3592233009708738,
"acc_stderr": 0.047504583990416946,
"acc_norm": 0.3592233009708738,
"acc_norm_stderr": 0.047504583990416946
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6623931623931624,
"acc_stderr": 0.030980296992618558,
"acc_norm": 0.6623931623931624,
"acc_norm_stderr": 0.030980296992618558
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5887611749680716,
"acc_stderr": 0.01759597190805657,
"acc_norm": 0.5887611749680716,
"acc_norm_stderr": 0.01759597190805657
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4913294797687861,
"acc_stderr": 0.026915047355369804,
"acc_norm": 0.4913294797687861,
"acc_norm_stderr": 0.026915047355369804
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.42810457516339867,
"acc_stderr": 0.028332397483664274,
"acc_norm": 0.42810457516339867,
"acc_norm_stderr": 0.028332397483664274
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5401929260450161,
"acc_stderr": 0.028306190403305696,
"acc_norm": 0.5401929260450161,
"acc_norm_stderr": 0.028306190403305696
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.027815973433878014,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.027815973433878014
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35106382978723405,
"acc_stderr": 0.028473501272963764,
"acc_norm": 0.35106382978723405,
"acc_norm_stderr": 0.028473501272963764
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3539765319426336,
"acc_stderr": 0.01221350473173164,
"acc_norm": 0.3539765319426336,
"acc_norm_stderr": 0.01221350473173164
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5183823529411765,
"acc_stderr": 0.030352303395351964,
"acc_norm": 0.5183823529411765,
"acc_norm_stderr": 0.030352303395351964
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.019977422600227467,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.019977422600227467
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5181818181818182,
"acc_stderr": 0.04785964010794915,
"acc_norm": 0.5181818181818182,
"acc_norm_stderr": 0.04785964010794915
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4204081632653061,
"acc_stderr": 0.03160106993449604,
"acc_norm": 0.4204081632653061,
"acc_norm_stderr": 0.03160106993449604
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5920398009950248,
"acc_stderr": 0.03475116365194092,
"acc_norm": 0.5920398009950248,
"acc_norm_stderr": 0.03475116365194092
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-virology|5": {
"acc": 0.35542168674698793,
"acc_stderr": 0.03726214354322415,
"acc_norm": 0.35542168674698793,
"acc_norm_stderr": 0.03726214354322415
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6549707602339181,
"acc_stderr": 0.036459813773888065,
"acc_norm": 0.6549707602339181,
"acc_norm_stderr": 0.036459813773888065
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931578,
"mc2": 0.398074913504716,
"mc2_stderr": 0.01370017096726305
},
"harness|winogrande|5": {
"acc": 0.7087608524072613,
"acc_stderr": 0.012769029305370697
},
"harness|gsm8k|5": {
"acc": 0.04169825625473844,
"acc_stderr": 0.005506205058175746
}
}
```
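Since the aggregated results are plain JSON, they can be inspected with a few lines of Python. The sketch below ranks a handful of the per-task scores quoted above by their headline metric; the dict is hand-copied from the JSON rather than fetched, so it stays self-contained:

```python
import json

# A few per-task metrics copied from the results JSON above.
results_json = """
{
  "harness|arc:challenge|25": {"acc_norm": 0.5068259385665529},
  "harness|hellaswag|10": {"acc_norm": 0.7714598685520813},
  "harness|winogrande|5": {"acc": 0.7087608524072613},
  "harness|gsm8k|5": {"acc": 0.04169825625473844}
}
"""
results = json.loads(results_json)

def headline(metrics):
    """Pick the headline metric: acc_norm where present, else acc."""
    return metrics.get("acc_norm", metrics.get("acc"))

ranked = sorted(results, key=lambda task: headline(results[task]), reverse=True)
print(ranked[0])   # harness|hellaswag|10
print(ranked[-1])  # harness|gsm8k|5
```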
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
tvergho/audio-diffusion-512 | ---
dataset_info:
features:
- name: image
dtype: image
- name: audio_file
dtype: string
- name: slice
dtype: int16
splits:
- name: train
num_bytes: 896860831.5
num_examples: 6964
download_size: 895892605
dataset_size: 896860831.5
---
# Dataset Card for "audio-diffusion-512"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KnutJaegersberg/Deita-6k | ---
license: mit
---
English subset of the data. |
rai-sandeep/dataset_full_v3 | ---
dataset_info:
features:
- name: doctype
dtype: string
- name: section
dtype: string
- name: topic
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 27897
num_examples: 26
download_size: 19998
dataset_size: 27897
---
# Dataset Card for "dataset_full_v3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jamespratama/tutorial-platypus-llamma | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4251526
num_examples: 1000
download_size: 2253085
dataset_size: 4251526
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
lilacai/lilac-HellaSwag | ---
tags:
- Lilac
---
# lilac/HellaSwag
This dataset is a [Lilac](http://lilacml.com)-processed dataset. Original dataset: [https://huggingface.co/datasets/Rowan/hellaswag](https://huggingface.co/datasets/Rowan/hellaswag)
To download the dataset to a local directory:
```bash
lilac download lilacai/lilac-HellaSwag
```
or from python with:
```py
import lilac as ll

ll.download("lilacai/lilac-HellaSwag")
```
|
Elfsong/patient_info | ---
configs:
- config_name: default
data_files:
- split: anxiety
path: data/anxiety-*
- split: depression
path: data/depression-*
- split: ptsd
path: data/ptsd-*
- split: bipolar
path: data/bipolar-*
- split: substance_misuse
path: data/substance_misuse-*
- split: eating_disorders
path: data/eating_disorders-*
- split: alcohol_consumption
path: data/alcohol_consumption-*
dataset_info:
features:
- name: url
dtype: string
- name: comments
list:
- name: author_from
sequence: string
- name: author_to
sequence: string
- name: comments
list:
- name: author_from
sequence: string
- name: author_to
sequence: string
- name: content
sequence: string
- name: date
sequence: string
- name: content
sequence: string
- name: date
sequence: string
- name: url
dtype: string
- name: title
dtype: string
- name: date
dtype: string
- name: content
dtype: string
- name: author
dtype: string
splits:
- name: anxiety
num_bytes: 143006120
num_examples: 27393
- name: depression
num_bytes: 49953142
num_examples: 6982
- name: ptsd
num_bytes: 1626957
num_examples: 349
- name: bipolar
num_bytes: 3087512
num_examples: 474
- name: substance_misuse
num_bytes: 1406369
num_examples: 195
- name: eating_disorders
num_bytes: 1294592
num_examples: 233
- name: alcohol_consumption
num_bytes: 21540333
num_examples: 1855
download_size: 109169290
dataset_size: 221915025
---
# Dataset Card for "patient_info"
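The `comments` feature above is nested: each top-level comment carries the same author/content/date fields plus its own inner `comments` list (two levels in the schema). A minimal sketch of walking that structure; the thread below is invented purely to illustrate the shape, not real forum data:

```python
# An invented thread matching the nested "comments" feature above.
thread = {
    "url": "https://example.org/thread",
    "title": "Example thread",
    "date": "2023-01-01",
    "author": "user_a",
    "content": "Original post.",
    "comments": [
        {
            "author_from": ["user_b"],
            "author_to": ["user_a"],
            "content": ["Top-level reply."],
            "date": ["2023-01-02"],
            "comments": [
                {
                    "author_from": ["user_c"],
                    "author_to": ["user_b"],
                    "content": ["Nested reply."],
                    "date": ["2023-01-03"],
                }
            ],
        }
    ],
}

def flatten_comments(comments):
    """Yield every comment depth-first, including nested replies."""
    for comment in comments:
        yield comment
        yield from flatten_comments(comment.get("comments", []))

all_comments = list(flatten_comments(thread["comments"]))
print(len(all_comments))  # 2
```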
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sid-th26/prelims_all_questions | ---
dataset_info:
features:
- name: section_name
dtype: string
- name: sub_section_name
dtype: string
- name: topic_name
dtype: string
- name: Question
dtype: string
- name: Option_A
dtype: string
- name: Option_B
dtype: string
- name: Option_C
dtype: string
- name: Option_D
dtype: string
- name: explanation
dtype: string
- name: difficulty
dtype: string
- name: answer
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 229012652
num_examples: 92171
download_size: 109175296
dataset_size: 229012652
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nuprl/MultiPL-E-synthetic-solutions | ---
dataset_info:
features:
- name: name
dtype: string
- name: language
dtype: string
- name: prompt
dtype: string
- name: solution
dtype: string
splits:
- name: train
num_bytes: 2185285
num_examples: 2624
download_size: 891673
dataset_size: 2185285
license: openrail
language:
- en
pretty_name: MultiPL-E Synthetic Solutions
---
# Dataset Card
This is a dataset of partial solutions to the HumanEval and MBPP code generation benchmarks translated into 18+
programming languages. The original benchmark problems were in Python, and we build the dataset as follows:
1. We translate the prompts into a new language using MultiPL-E;
2. We use code-davinci-002 to generate 200 completions for each problem at temperature 0.8;
3. We select a working solution (if one exists) for each problem-language pair.
[This notebook](https://github.com/nuprl/MultiPL-E/blob/main/notebooks/build_synthetic_solutions_dataset.ipynb)
carried out the steps described above.
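Because each row carries a `language` field alongside `name`, `prompt`, and `solution`, per-language slices fall out of a simple filter. The sketch below uses tiny in-memory placeholder rows with those four fields rather than downloading the real dataset (in practice one would use `datasets.load_dataset` on this repository):

```python
from collections import Counter

# Placeholder rows using the dataset's four fields; the contents are
# invented, not real benchmark prompts or solutions.
rows = [
    {"name": "HumanEval_0", "language": "lua", "prompt": "-- p", "solution": "-- s"},
    {"name": "HumanEval_0", "language": "rkt", "prompt": "; p", "solution": "; s"},
    {"name": "HumanEval_1", "language": "lua", "prompt": "-- p", "solution": "-- s"},
]

# Count verified solutions per language (coverage is not complete for
# every problem-language pair).
per_language = Counter(row["language"] for row in rows)
print(per_language["lua"])  # 2

# Keep only one language's solutions.
lua_only = [row for row in rows if row["language"] == "lua"]
print(len(lua_only))  # 2
```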
Note that the dataset does *not* have solutions for every problem-language pair, since code-davinci-002 cannot
produce a correct solution to every problem. |
Joedwoo/NTT_version_1.0 | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 3595
num_examples: 13
download_size: 4937
dataset_size: 3595
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
cmarvolo/auto_fl | ---
license: apache-2.0
task_categories:
- text-generation
language:
- en
tags:
- not-for-all-audiences
size_categories:
- n<1K
--- |
insanemyrr/test-diploma-lucchi-cropped-new-mix-biggest | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': testing
'1': training
splits:
- name: train
num_bytes: 299779032.96
num_examples: 3960
- name: test
num_bytes: 299751233.76
num_examples: 3960
download_size: 599433953
dataset_size: 599530266.72
---
# Dataset Card for "test-diploma-lucchi-cropped-new-mix-biggest"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-28000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 664600
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
polinaeterna/push_to_hub_many_configs | ---
builder_configs:
- config_name: custom
data_files:
- split: train
pattern: custom/train-*
- split: random
pattern: custom/random-*
- config_name: default
data_files:
- split: train
pattern: data/train-*
- split: random
pattern: data/random-*
dataset_info:
- config_name: custom
features:
- name: x
dtype: int64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 1600
num_examples: 100
- name: random
num_bytes: 160
num_examples: 10
download_size: 3650
dataset_size: 1760
- config_name: default
features:
- name: x
dtype: int64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 1600
num_examples: 100
- name: random
num_bytes: 800
num_examples: 50
download_size: 4042
dataset_size: 2400
---
# Dataset Card for "push_to_hub_many_configs"
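Each `config_name` in the `builder_configs` block above maps split names to file glob patterns. A small sketch of that mapping as plain Python, mirroring the YAML values exactly:

```python
# Mirror of the builder_configs block above, as plain Python.
builder_configs = {
    "custom": {"train": "custom/train-*", "random": "custom/random-*"},
    "default": {"train": "data/train-*", "random": "data/random-*"},
}

def pattern_for(config, split):
    """Resolve a (config, split) pair to its data-file glob pattern."""
    return builder_configs[config][split]

print(pattern_for("custom", "random"))  # custom/random-*
print(pattern_for("default", "train"))  # data/train-*
```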
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/oosaki_amana_theidolmstershinycolors | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of oosaki_amana/大崎甘奈 (THE iDOLM@STER: SHINY COLORS)
This is the dataset of oosaki_amana/大崎甘奈 (THE iDOLM@STER: SHINY COLORS), containing 500 images and their tags.
The core tags of this character are `long_hair, bangs, brown_hair, yellow_eyes, breasts, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 978.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/oosaki_amana_theidolmstershinycolors/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 475.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/oosaki_amana_theidolmstershinycolors/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1311 | 1.04 GiB | [Download](https://huggingface.co/datasets/CyberHarem/oosaki_amana_theidolmstershinycolors/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 815.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/oosaki_amana_theidolmstershinycolors/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1311 | 1.66 GiB | [Download](https://huggingface.co/datasets/CyberHarem/oosaki_amana_theidolmstershinycolors/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/oosaki_amana_theidolmstershinycolors',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, blush, cleavage, collarbone, looking_at_viewer, solo, bare_shoulders, necklace, long_sleeves, off-shoulder_sweater, sitting, closed_mouth, double_bun, dress, earrings, smile, swept_bangs |
| 1 | 14 |  |  |  |  |  | 1girl, looking_at_viewer, solo, blush, smile, collarbone, white_background, closed_mouth, simple_background, upper_body, bare_shoulders, brown_eyes, cleavage, choker, heart, sleeveless, white_dress |
| 2 | 12 |  |  |  |  |  | 1girl, looking_at_viewer, necktie, plaid_skirt, school_uniform, solo, pleated_skirt, blush, open_mouth, :d, long_sleeves, white_shirt, braid, simple_background, sweater, blazer, collared_shirt, outdoors, petals, red_hair, white_background |
| 3 | 5 |  |  |  |  |  | 1girl, collarbone, eyewear_on_head, heart-shaped_eyewear, navel, red_bikini, sunglasses, black_choker, floral_print, looking_at_viewer, simple_background, solo, swept_bangs, blush, bracelet, cleavage, earrings, necklace, one_eye_closed, open_mouth, white_background, :d, bare_shoulders, groin, thighs |
| 4 | 6 |  |  |  |  |  | 1girl, blush, gloves, looking_at_viewer, solo, white_coat, braid, open_mouth, smile, fur_hat, long_sleeves, snowing, white_headwear, winter_clothes, fur_trim, hat_bow, upper_body |
| 5 | 5 |  |  |  |  |  | 1girl, blue_sky, blush, day, looking_at_viewer, navel, outdoors, solo, armpits, arms_up, cloud, midriff, arms_behind_head, cowboy_shot, hairband, :d, bikini_under_clothes, blue_shorts, brown_eyes, collarbone, denim_shorts, frills, hair_bow, open_mouth, ribbon, short_shorts, swept_bangs, tied_shirt |
| 6 | 10 |  |  |  |  |  | 1girl, blush, looking_at_viewer, hair_ornament, obi, print_kimono, solo, floral_print, side_ponytail, sidelocks, hair_between_eyes, open_mouth, pink_kimono, wide_sleeves, :d, long_sleeves, yukata, blurry, swept_bangs, upper_body |
| 7 | 5 |  |  |  |  |  | 1girl, blush, collarbone, looking_at_viewer, race_queen, solo, thigh_boots, thighhighs, black_choker, cleavage, hair_ribbon, holding_umbrella, navel, red_hair, belt, black_skirt, high_ponytail, miniskirt, open_mouth, smile, standing, swept_bangs, wrist_cuffs, bare_shoulders, black_footwear, full_body, ground_vehicle, hoop_earrings, midriff, mismatched_footwear, mismatched_legwear, motor_vehicle, nail_polish, official_alternate_costume, outdoors, sidelocks, sleeveless, tube_top |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | cleavage | collarbone | looking_at_viewer | solo | bare_shoulders | necklace | long_sleeves | off-shoulder_sweater | sitting | closed_mouth | double_bun | dress | earrings | smile | swept_bangs | white_background | simple_background | upper_body | brown_eyes | choker | heart | sleeveless | white_dress | necktie | plaid_skirt | school_uniform | pleated_skirt | open_mouth | :d | white_shirt | braid | sweater | blazer | collared_shirt | outdoors | petals | red_hair | eyewear_on_head | heart-shaped_eyewear | navel | red_bikini | sunglasses | black_choker | floral_print | bracelet | one_eye_closed | groin | thighs | gloves | white_coat | fur_hat | snowing | white_headwear | winter_clothes | fur_trim | hat_bow | blue_sky | day | armpits | arms_up | cloud | midriff | arms_behind_head | cowboy_shot | hairband | bikini_under_clothes | blue_shorts | denim_shorts | frills | hair_bow | ribbon | short_shorts | tied_shirt | hair_ornament | obi | print_kimono | side_ponytail | sidelocks | hair_between_eyes | pink_kimono | wide_sleeves | yukata | blurry | race_queen | thigh_boots | thighhighs | hair_ribbon | holding_umbrella | belt | black_skirt | high_ponytail | miniskirt | standing | wrist_cuffs | black_footwear | full_body | ground_vehicle | hoop_earrings | mismatched_footwear | mismatched_legwear | motor_vehicle | nail_polish | official_alternate_costume | tube_top |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-----------|:-------------|:--------------------|:-------|:-----------------|:-----------|:---------------|:-----------------------|:----------|:---------------|:-------------|:--------|:-----------|:--------|:--------------|:-------------------|:--------------------|:-------------|:-------------|:---------|:--------|:-------------|:--------------|:----------|:--------------|:-----------------|:----------------|:-------------|:-----|:--------------|:--------|:----------|:---------|:-----------------|:-----------|:---------|:-----------|:------------------|:-----------------------|:--------|:-------------|:-------------|:---------------|:---------------|:-----------|:-----------------|:--------|:---------|:---------|:-------------|:----------|:----------|:-----------------|:-----------------|:-----------|:----------|:-----------|:------|:----------|:----------|:--------|:----------|:-------------------|:--------------|:-----------|:-----------------------|:--------------|:---------------|:---------|:-----------|:---------|:---------------|:-------------|:----------------|:------|:---------------|:----------------|:------------|:--------------------|:--------------|:---------------|:---------|:---------|:-------------|:--------------|:-------------|:--------------|:-------------------|:-------|:--------------|:----------------|:------------|:-----------|:--------------|:-----------------|:------------|:-----------------|:----------------|:----------------------|:---------------------|:----------------|:--------------|:-----------------------------|:-----------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | | | | | X | | | | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 12 |  |  |  |  |  | X | X | | | X | X | | | X | | | | | | | | | X | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | | | | X | | X | X | X | | | | | | | | | | | X | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | X | | | X | X | | | X | | | | | | | X | | | | X | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | X | | X | X | X | | | | | | | | | | | X | | | | X | | | | | | | | | X | X | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 10 |  |  |  |  |  | X | X | | | X | X | | | X | | | | | | | | X | | | X | | | | | | | | | | X | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | | | | | | | | | X | X | | | | | | | X | | | | | | X | | | | | | | X | | X | | | X | | | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
polinaeterna/old_push | ---
dataset_info:
features:
- name: x
dtype: int64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 800
num_examples: 50
download_size: 1763
dataset_size: 800
---
# Dataset Card for "old_push"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-lener_br-lener_br-2a71c5-1777061680 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- lener_br
eval_info:
task: entity_extraction
model: pierreguillou/ner-bert-base-cased-pt-lenerbr
metrics: []
dataset_name: lener_br
dataset_config: lener_br
dataset_split: validation
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: pierreguillou/ner-bert-base-cased-pt-lenerbr
* Dataset: lener_br
* Config: lener_br
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Luciano](https://huggingface.co/Luciano) for evaluating this model. |
Orange/paraqa-sparqltotext | ---
dataset_info:
features:
- name: uid
dtype: string
- name: query
dtype: string
- name: question
dtype: string
- name: simplified_query
dtype: string
- name: answer
dtype: string
- name: verbalized_answer
dtype: string
- name: verbalized_answer_2
dtype: string
- name: verbalized_answer_3
dtype: string
- name: verbalized_answer_4
dtype: string
- name: verbalized_answer_5
dtype: string
- name: verbalized_answer_6
dtype: string
- name: verbalized_answer_7
dtype: string
- name: verbalized_answer_8
dtype: string
splits:
- name: train
num_bytes: 2540548
num_examples: 3500
- name: validation
num_bytes: 369571
num_examples: 500
- name: test
num_bytes: 722302
num_examples: 1000
download_size: 1750172
dataset_size: 3632421
task_categories:
- conversational
- question-answering
- text-generation
- text2text-generation
tags:
- qa
- knowledge-graph
- sparql
---
# Dataset Card for ParaQA-SPARQLtoText
## Table of Contents
- [Dataset Card for ParaQA-SPARQLtoText](#dataset-card-for-paraqa-sparqltotext)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [New field `simplified_query`](#new-field-simplified_query)
- [New split "valid"](#new-split-valid)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Types of questions](#types-of-questions)
- [Data splits](#data-splits)
- [Additional information](#additional-information)
- [Related datasets](#related-datasets)
- [Licencing information](#licencing-information)
- [Citation information](#citation-information)
- [This version of the corpus (with normalized SPARQL queries)](#this-version-of-the-corpus-with-normalized-sparql-queries)
- [Original version](#original-version)
## Dataset Description
- **Paper:** [SPARQL-to-Text Question Generation for Knowledge-Based Conversational Applications (AACL-IJCNLP 2022)](https://aclanthology.org/2022.aacl-main.11/)
- **Point of Contact:** Gwénolé Lecorvé
### Dataset Summary
Special version of ParaQA with SPARQL queries formatted for the SPARQL-to-Text task.
#### New field `simplified_query`
The new field is named `simplified_query`. It results from applying the following steps to the field `query`:
* Replacing URIs with a simpler format, using the prefixes `resource:`, `property:` and `ontology:`
* Spacing out the delimiters `(`, `{`, `.`, `}`, `)`
* Randomizing the variable names
* Shuffling the clauses
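Concretely, these steps could look something like the following sketch (the script actually used to build the corpus is not published on this card, so the DBpedia prefix URLs, the `?v<i>` renaming scheme, and the clause-splitting heuristic below are assumptions):

```python
import random
import re

# Hypothetical sketch of the four normalization steps above; the prefix URLs,
# the "?v<i>" naming scheme, and the clause splitting are assumptions.
URI_PREFIXES = {
    "http://dbpedia.org/resource/": "resource:",
    "http://dbpedia.org/property/": "property:",
    "http://dbpedia.org/ontology/": "ontology:",
}

def simplify_query(query: str, seed: int = 0) -> str:
    rng = random.Random(seed)

    # 1. Replace <full URIs> with the short resource:/property:/ontology: prefixes.
    for uri, prefix in URI_PREFIXES.items():
        query = re.sub("<" + re.escape(uri) + "([^>]+)>", prefix + r"\1", query)

    # 2. Space out the delimiters ( { . } ).
    query = re.sub(r"\s*([({}).])\s*", r" \1 ", query)

    # 3. Rename the variables to randomized ?v<i> names, consistently across the query.
    variables = list(dict.fromkeys(re.findall(r"\?\w+", query)))
    new_names = [f"?v{i}" for i in range(len(variables))]
    rng.shuffle(new_names)
    for old, new in zip(variables, new_names):
        query = re.sub(re.escape(old) + r"\b", new, query)

    # 4. Shuffle the clauses of the WHERE block (split on the spaced "." separator).
    body = re.search(r"\{(.*)\}", query)
    if body:
        clauses = [c.strip() for c in body.group(1).split(" . ") if c.strip()]
        rng.shuffle(clauses)
        query = (query[: body.start(1)] + " " + " . ".join(clauses)
                 + " " + query[body.end(1):])

    return query.strip()
```

Note that step 3, as sketched, assumes the original query does not already use `?v<i>`-style variable names.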
#### New split "valid"
A validation set was randomly extracted from the test set; it represents 10% of the whole dataset.
### Languages
- English
## Dataset Structure
### Types of questions
Comparison of question types with related datasets:
| | | [SimpleQuestions](https://huggingface.co/datasets/OrangeInnov/simplequestions-sparqltotext) | [ParaQA](https://huggingface.co/datasets/OrangeInnov/paraqa-sparqltotext) | [LC-QuAD 2.0](https://huggingface.co/datasets/OrangeInnov/lcquad_2.0-sparqltotext) | [CSQA](https://huggingface.co/datasets/OrangeInnov/csqa-sparqltotext) | [WebNLQ-QA](https://huggingface.co/datasets/OrangeInnov/webnlg-qa) |
|--------------------------|-----------------|:---------------:|:------:|:-----------:|:----:|:---------:|
| **Number of triplets in query** | 1 | ✓ | ✓ | ✓ | ✓ | ✓ |
| | 2 | | ✓ | ✓ | ✓ | ✓ |
| | More | | | ✓ | ✓ | ✓ |
| **Logical connector between triplets** | Conjunction | ✓ | ✓ | ✓ | ✓ | ✓ |
| | Disjunction | | | | ✓ | ✓ |
| | Exclusion | | | | ✓ | ✓ |
| **Topology of the query graph** | Direct | ✓ | ✓ | ✓ | ✓ | ✓ |
| | Sibling | | ✓ | ✓ | ✓ | ✓ |
| | Chain | | ✓ | ✓ | ✓ | ✓ |
| | Mixed | | | ✓ | | ✓ |
| | Other | | ✓ | ✓ | ✓ | ✓ |
| **Variable typing in the query** | None | ✓ | ✓ | ✓ | ✓ | ✓ |
| | Target variable | | ✓ | ✓ | ✓ | ✓ |
| | Internal variable | | ✓ | ✓ | ✓ | ✓ |
| **Comparison clauses** | None | ✓ | ✓ | ✓ | ✓ | ✓ |
| | String | | | ✓ | | ✓ |
| | Number | | | ✓ | ✓ | ✓ |
| | Date | | | ✓ | | ✓ |
| **Superlative clauses** | No | ✓ | ✓ | ✓ | ✓ | ✓ |
| | Yes | | | | ✓ | |
| **Answer type** | Entity (open) | ✓ | ✓ | ✓ | ✓ | ✓ |
| | Entity (closed) | | | | ✓ | ✓ |
| | Number | | | ✓ | ✓ | ✓ |
| | Boolean | | ✓ | ✓ | ✓ | ✓ |
| **Answer cardinality** | 0 (unanswerable) | | | ✓ | | ✓ |
| | 1 | ✓ | ✓ | ✓ | ✓ | ✓ |
| | More | | ✓ | ✓ | ✓ | ✓ |
| **Number of target variables** | 0 (→ ASK verb) | | ✓ | ✓ | ✓ | ✓ |
| | 1 | ✓ | ✓ | ✓ | ✓ | ✓ |
| | 2 | | | ✓ | | ✓ |
| **Dialogue context** | Self-sufficient | ✓ | ✓ | ✓ | ✓ | ✓ |
| | Coreference | | | | ✓ | ✓ |
| | Ellipsis | | | | ✓ | ✓ |
| **Meaning** | Meaningful | ✓ | ✓ | ✓ | ✓ | ✓ |
| | Non-sense | | | | | ✓ |
### Data splits
Text verbalization is only available for a subset of the test set, referred to as the *challenge set*. Other samples only contain dialogues in the form of follow-up SPARQL queries.
| | Train | Validation | Test |
| --------------------- | ---------- | ---------- | ---------- |
| Questions | 3,500 | 500 | 1,000 |
| NL question per query | 1 |
| Characters per query | 103 (± 27) |
| Tokens per question | 10.3 (± 3.7) |
## Additional information
### Related datasets
This corpus is part of a set of 5 datasets released for SPARQL-to-Text generation, namely:
- Non conversational datasets
- [SimpleQuestions](https://huggingface.co/datasets/OrangeInnov/simplequestions-sparqltotext) (from https://github.com/askplatypus/wikidata-simplequestions)
- [ParaQA](https://huggingface.co/datasets/OrangeInnov/paraqa-sparqltotext) (from https://github.com/barshana-banerjee/ParaQA)
- [LC-QuAD 2.0](https://huggingface.co/datasets/OrangeInnov/lcquad_2.0-sparqltotext) (from http://lc-quad.sda.tech/)
- Conversational datasets
- [CSQA](https://huggingface.co/datasets/OrangeInnov/csqa-sparqltotext) (from https://amritasaha1812.github.io/CSQA/)
- [WebNLQ-QA](https://huggingface.co/datasets/OrangeInnov/webnlg-qa) (derived from https://gitlab.com/shimorina/webnlg-dataset/-/tree/master/release_v3.0)
### Licencing information
* Content from original dataset: CC-BY 4.0
* New content: CC BY-SA 4.0
### Citation information
#### This version of the corpus (with normalized SPARQL queries)
```bibtex
@inproceedings{lecorve2022sparql2text,
  title={SPARQL-to-Text Question Generation for Knowledge-Based Conversational Applications},
  author={Lecorv\'e, Gw\'enol\'e and Veyret, Morgan and Brabant, Quentin and Rojas-Barahona, Lina M.},
  booktitle={Proceedings of the Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the International Joint Conference on Natural Language Processing (AACL-IJCNLP)},
  year={2022}
}
```
#### Original version
```bibtex
@inproceedings{kacupaj2021paraqa,
title={Paraqa: a question answering dataset with paraphrase responses for single-turn conversation},
author={Kacupaj, Endri and Banerjee, Barshana and Singh, Kuldeep and Lehmann, Jens},
booktitle={European semantic web conference},
pages={598--613},
year={2021},
organization={Springer}
}
```
|
CyberHarem/sumi_otokawa_sakuratrick | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Sumi Otokawa
This is the dataset of Sumi Otokawa, containing 49 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 49 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 117 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 145 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 49 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 49 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 49 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 117 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 117 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 101 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 145 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 145 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
open-llm-leaderboard/details_cognitivecomputations__MegaDolphin-120b | ---
pretty_name: Evaluation run of cognitivecomputations/MegaDolphin-120b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cognitivecomputations/MegaDolphin-120b](https://huggingface.co/cognitivecomputations/MegaDolphin-120b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cognitivecomputations__MegaDolphin-120b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-21T21:48:58.549252](https://huggingface.co/datasets/open-llm-leaderboard/details_cognitivecomputations__MegaDolphin-120b/blob/main/results_2024-01-21T21-48-58.549252.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6902282983987049,\n\
\ \"acc_stderr\": 0.03061680900457877,\n \"acc_norm\": 0.6956826713561578,\n\
\ \"acc_norm_stderr\": 0.03120808276540501,\n \"mc1\": 0.41370869033047736,\n\
\ \"mc1_stderr\": 0.017240861812099804,\n \"mc2\": 0.592821117756168,\n\
\ \"mc2_stderr\": 0.015249093012153285\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.643344709897611,\n \"acc_stderr\": 0.013998056902620196,\n\
\ \"acc_norm\": 0.6902730375426621,\n \"acc_norm_stderr\": 0.013512058415238361\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7026488747261501,\n\
\ \"acc_stderr\": 0.004561582009834577,\n \"acc_norm\": 0.8780123481378211,\n\
\ \"acc_norm_stderr\": 0.0032660269509226444\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7828947368421053,\n \"acc_stderr\": 0.03355045304882923,\n\
\ \"acc_norm\": 0.7828947368421053,\n \"acc_norm_stderr\": 0.03355045304882923\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.78,\n\
\ \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n \
\ \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7358490566037735,\n \"acc_stderr\": 0.0271342916287417,\n\
\ \"acc_norm\": 0.7358490566037735,\n \"acc_norm_stderr\": 0.0271342916287417\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n\
\ \"acc_stderr\": 0.03216600808802268,\n \"acc_norm\": 0.8194444444444444,\n\
\ \"acc_norm_stderr\": 0.03216600808802268\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n\
\ \"acc_stderr\": 0.035149425512674394,\n \"acc_norm\": 0.6936416184971098,\n\
\ \"acc_norm_stderr\": 0.035149425512674394\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6893617021276596,\n \"acc_stderr\": 0.03025123757921317,\n\
\ \"acc_norm\": 0.6893617021276596,\n \"acc_norm_stderr\": 0.03025123757921317\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6620689655172414,\n \"acc_stderr\": 0.039417076320648906,\n\
\ \"acc_norm\": 0.6620689655172414,\n \"acc_norm_stderr\": 0.039417076320648906\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.47619047619047616,\n \"acc_stderr\": 0.025722097064388525,\n \"\
acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.025722097064388525\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5238095238095238,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.5238095238095238,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8290322580645161,\n \"acc_stderr\": 0.02141724293632158,\n \"\
acc_norm\": 0.8290322580645161,\n \"acc_norm_stderr\": 0.02141724293632158\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n \"\
acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\"\
: 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781678,\n\
\ \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781678\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8585858585858586,\n \"acc_stderr\": 0.02482590979334335,\n \"\
acc_norm\": 0.8585858585858586,\n \"acc_norm_stderr\": 0.02482590979334335\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7461538461538462,\n \"acc_stderr\": 0.022066054378726257,\n\
\ \"acc_norm\": 0.7461538461538462,\n \"acc_norm_stderr\": 0.022066054378726257\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606647,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606647\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7815126050420168,\n \"acc_stderr\": 0.02684151432295894,\n \
\ \"acc_norm\": 0.7815126050420168,\n \"acc_norm_stderr\": 0.02684151432295894\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4370860927152318,\n \"acc_stderr\": 0.04050035722230636,\n \"\
acc_norm\": 0.4370860927152318,\n \"acc_norm_stderr\": 0.04050035722230636\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8844036697247707,\n \"acc_stderr\": 0.01370874953417264,\n \"\
acc_norm\": 0.8844036697247707,\n \"acc_norm_stderr\": 0.01370874953417264\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5787037037037037,\n \"acc_stderr\": 0.03367462138896078,\n \"\
acc_norm\": 0.5787037037037037,\n \"acc_norm_stderr\": 0.03367462138896078\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9313725490196079,\n \"acc_stderr\": 0.017744453647073315,\n \"\
acc_norm\": 0.9313725490196079,\n \"acc_norm_stderr\": 0.017744453647073315\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.890295358649789,\n \"acc_stderr\": 0.020343400734868834,\n \
\ \"acc_norm\": 0.890295358649789,\n \"acc_norm_stderr\": 0.020343400734868834\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7847533632286996,\n\
\ \"acc_stderr\": 0.02758406660220827,\n \"acc_norm\": 0.7847533632286996,\n\
\ \"acc_norm_stderr\": 0.02758406660220827\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n\
\ \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8842975206611571,\n \"acc_stderr\": 0.029199802455622814,\n \"\
acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.029199802455622814\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8466257668711656,\n \"acc_stderr\": 0.028311601441438607,\n\
\ \"acc_norm\": 0.8466257668711656,\n \"acc_norm_stderr\": 0.028311601441438607\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5714285714285714,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.5714285714285714,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822582,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822582\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.905982905982906,\n\
\ \"acc_stderr\": 0.019119892798924974,\n \"acc_norm\": 0.905982905982906,\n\
\ \"acc_norm_stderr\": 0.019119892798924974\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8441890166028098,\n\
\ \"acc_stderr\": 0.012969269247762578,\n \"acc_norm\": 0.8441890166028098,\n\
\ \"acc_norm_stderr\": 0.012969269247762578\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7832369942196532,\n \"acc_stderr\": 0.022183477668412856,\n\
\ \"acc_norm\": 0.7832369942196532,\n \"acc_norm_stderr\": 0.022183477668412856\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5150837988826815,\n\
\ \"acc_stderr\": 0.01671489037999606,\n \"acc_norm\": 0.5150837988826815,\n\
\ \"acc_norm_stderr\": 0.01671489037999606\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7684887459807074,\n\
\ \"acc_stderr\": 0.023956532766639133,\n \"acc_norm\": 0.7684887459807074,\n\
\ \"acc_norm_stderr\": 0.023956532766639133\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.02161380939522479,\n\
\ \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.02161380939522479\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5567375886524822,\n \"acc_stderr\": 0.029634838473766006,\n \
\ \"acc_norm\": 0.5567375886524822,\n \"acc_norm_stderr\": 0.029634838473766006\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5867014341590613,\n\
\ \"acc_stderr\": 0.012576779494860076,\n \"acc_norm\": 0.5867014341590613,\n\
\ \"acc_norm_stderr\": 0.012576779494860076\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7095588235294118,\n \"acc_stderr\": 0.027576468622740533,\n\
\ \"acc_norm\": 0.7095588235294118,\n \"acc_norm_stderr\": 0.027576468622740533\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7565359477124183,\n \"acc_stderr\": 0.017362473762146606,\n \
\ \"acc_norm\": 0.7565359477124183,\n \"acc_norm_stderr\": 0.017362473762146606\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7714285714285715,\n \"acc_stderr\": 0.02688214492230774,\n\
\ \"acc_norm\": 0.7714285714285715,\n \"acc_norm_stderr\": 0.02688214492230774\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.024845753212306046,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.024845753212306046\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160896,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160896\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.41370869033047736,\n\
\ \"mc1_stderr\": 0.017240861812099804,\n \"mc2\": 0.592821117756168,\n\
\ \"mc2_stderr\": 0.015249093012153285\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8184688239936859,\n \"acc_stderr\": 0.010833276515007508\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4624715693707354,\n \
\ \"acc_stderr\": 0.013733636059107756\n }\n}\n```"
repo_url: https://huggingface.co/cognitivecomputations/MegaDolphin-120b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|arc:challenge|25_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|gsm8k|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hellaswag|10_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T21-48-58.549252.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T21-48-58.549252.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- '**/details_harness|winogrande|5_2024-01-21T21-48-58.549252.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-21T21-48-58.549252.parquet'
- config_name: results
data_files:
- split: 2024_01_21T21_48_58.549252
path:
- results_2024-01-21T21-48-58.549252.parquet
- split: latest
path:
- results_2024-01-21T21-48-58.549252.parquet
---
# Dataset Card for Evaluation run of cognitivecomputations/MegaDolphin-120b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cognitivecomputations/MegaDolphin-120b](https://huggingface.co/cognitivecomputations/MegaDolphin-120b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cognitivecomputations__MegaDolphin-120b",
"harness_winogrande_5",
split="train")
```
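The aggregated metrics in the "results" configuration are, roughly, means of the per-task scores. As a self-contained illustration (the task names and values below are made up, not taken from this run), averaging per-task accuracies into an overall score can be sketched as:

```python
# Sketch: averaging per-task metrics into an aggregate score.
# Task names and values are illustrative placeholders, not real results.
scores = {
    "task_a": {"acc": 0.31, "acc_norm": 0.31},
    "task_b": {"acc": 0.60, "acc_norm": 0.60},
    "task_c": {"acc": 0.78, "acc_norm": 0.78},
}

def aggregate(scores, metric):
    """Mean of one metric across all tasks."""
    values = [task[metric] for task in scores.values()]
    return sum(values) / len(values)

print(round(aggregate(scores, "acc"), 4))  # 0.5633
```

The actual leaderboard aggregation may weight or group tasks differently (e.g. the MMLU subtasks), so treat this as a conceptual sketch only.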
## Latest results
These are the [latest results from run 2024-01-21T21:48:58.549252](https://huggingface.co/datasets/open-llm-leaderboard/details_cognitivecomputations__MegaDolphin-120b/blob/main/results_2024-01-21T21-48-58.549252.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6902282983987049,
"acc_stderr": 0.03061680900457877,
"acc_norm": 0.6956826713561578,
"acc_norm_stderr": 0.03120808276540501,
"mc1": 0.41370869033047736,
"mc1_stderr": 0.017240861812099804,
"mc2": 0.592821117756168,
"mc2_stderr": 0.015249093012153285
},
"harness|arc:challenge|25": {
"acc": 0.643344709897611,
"acc_stderr": 0.013998056902620196,
"acc_norm": 0.6902730375426621,
"acc_norm_stderr": 0.013512058415238361
},
"harness|hellaswag|10": {
"acc": 0.7026488747261501,
"acc_stderr": 0.004561582009834577,
"acc_norm": 0.8780123481378211,
"acc_norm_stderr": 0.0032660269509226444
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7828947368421053,
"acc_stderr": 0.03355045304882923,
"acc_norm": 0.7828947368421053,
"acc_norm_stderr": 0.03355045304882923
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7358490566037735,
"acc_stderr": 0.0271342916287417,
"acc_norm": 0.7358490566037735,
"acc_norm_stderr": 0.0271342916287417
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8194444444444444,
"acc_stderr": 0.03216600808802268,
"acc_norm": 0.8194444444444444,
"acc_norm_stderr": 0.03216600808802268
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.035149425512674394,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.035149425512674394
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062946,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6893617021276596,
"acc_stderr": 0.03025123757921317,
"acc_norm": 0.6893617021276596,
"acc_norm_stderr": 0.03025123757921317
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6620689655172414,
"acc_stderr": 0.039417076320648906,
"acc_norm": 0.6620689655172414,
"acc_norm_stderr": 0.039417076320648906
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.025722097064388525,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.025722097064388525
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5238095238095238,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.5238095238095238,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8290322580645161,
"acc_stderr": 0.02141724293632158,
"acc_norm": 0.8290322580645161,
"acc_norm_stderr": 0.02141724293632158
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.027998073798781678,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.027998073798781678
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8585858585858586,
"acc_stderr": 0.02482590979334335,
"acc_norm": 0.8585858585858586,
"acc_norm_stderr": 0.02482590979334335
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7461538461538462,
"acc_stderr": 0.022066054378726257,
"acc_norm": 0.7461538461538462,
"acc_norm_stderr": 0.022066054378726257
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606647,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606647
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7815126050420168,
"acc_stderr": 0.02684151432295894,
"acc_norm": 0.7815126050420168,
"acc_norm_stderr": 0.02684151432295894
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4370860927152318,
"acc_stderr": 0.04050035722230636,
"acc_norm": 0.4370860927152318,
"acc_norm_stderr": 0.04050035722230636
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8844036697247707,
"acc_stderr": 0.01370874953417264,
"acc_norm": 0.8844036697247707,
"acc_norm_stderr": 0.01370874953417264
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5787037037037037,
"acc_stderr": 0.03367462138896078,
"acc_norm": 0.5787037037037037,
"acc_norm_stderr": 0.03367462138896078
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9313725490196079,
"acc_stderr": 0.017744453647073315,
"acc_norm": 0.9313725490196079,
"acc_norm_stderr": 0.017744453647073315
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.890295358649789,
"acc_stderr": 0.020343400734868834,
"acc_norm": 0.890295358649789,
"acc_norm_stderr": 0.020343400734868834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7847533632286996,
"acc_stderr": 0.02758406660220827,
"acc_norm": 0.7847533632286996,
"acc_norm_stderr": 0.02758406660220827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.029199802455622814,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.029199802455622814
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8466257668711656,
"acc_stderr": 0.028311601441438607,
"acc_norm": 0.8466257668711656,
"acc_norm_stderr": 0.028311601441438607
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822582,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822582
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.905982905982906,
"acc_stderr": 0.019119892798924974,
"acc_norm": 0.905982905982906,
"acc_norm_stderr": 0.019119892798924974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8441890166028098,
"acc_stderr": 0.012969269247762578,
"acc_norm": 0.8441890166028098,
"acc_norm_stderr": 0.012969269247762578
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7832369942196532,
"acc_stderr": 0.022183477668412856,
"acc_norm": 0.7832369942196532,
"acc_norm_stderr": 0.022183477668412856
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5150837988826815,
"acc_stderr": 0.01671489037999606,
"acc_norm": 0.5150837988826815,
"acc_norm_stderr": 0.01671489037999606
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7684887459807074,
"acc_stderr": 0.023956532766639133,
"acc_norm": 0.7684887459807074,
"acc_norm_stderr": 0.023956532766639133
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.02161380939522479,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.02161380939522479
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5567375886524822,
"acc_stderr": 0.029634838473766006,
"acc_norm": 0.5567375886524822,
"acc_norm_stderr": 0.029634838473766006
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5867014341590613,
"acc_stderr": 0.012576779494860076,
"acc_norm": 0.5867014341590613,
"acc_norm_stderr": 0.012576779494860076
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7095588235294118,
"acc_stderr": 0.027576468622740533,
"acc_norm": 0.7095588235294118,
"acc_norm_stderr": 0.027576468622740533
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7565359477124183,
"acc_stderr": 0.017362473762146606,
"acc_norm": 0.7565359477124183,
"acc_norm_stderr": 0.017362473762146606
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7714285714285715,
"acc_stderr": 0.02688214492230774,
"acc_norm": 0.7714285714285715,
"acc_norm_stderr": 0.02688214492230774
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306046,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306046
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160896,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160896
},
"harness|truthfulqa:mc|0": {
"mc1": 0.41370869033047736,
"mc1_stderr": 0.017240861812099804,
"mc2": 0.592821117756168,
"mc2_stderr": 0.015249093012153285
},
"harness|winogrande|5": {
"acc": 0.8184688239936859,
"acc_stderr": 0.010833276515007508
},
"harness|gsm8k|5": {
"acc": 0.4624715693707354,
"acc_stderr": 0.013733636059107756
}
}
```
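As a rough sketch (not part of the original card), the aggregate figures in the `"all"` entry can be recomputed from the per-task entries. The snippet below averages `acc` over the harness tasks in a results dict shaped like the JSON above; the sample dict is a small illustrative subset of the tasks.

```python
# Sketch: recompute an aggregate accuracy from per-task results shaped like
# the JSON above. The sample dict is a small illustrative subset.
results = {
    "harness|arc:challenge|25": {"acc": 0.643344709897611},
    "harness|hellaswag|10": {"acc": 0.7026488747261501},
    "harness|winogrande|5": {"acc": 0.8184688239936859},
}

# Average "acc" across every task entry (skipping the "all" summary if present).
task_accs = [v["acc"] for k, v in results.items() if k != "all" and "acc" in v]
mean_acc = sum(task_accs) / len(task_accs)
print(f"mean acc over {len(task_accs)} tasks: {mean_acc:.4f}")
```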
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
PanoEvJ/GenAI-sample | ---
dataset_info:
features:
- name: name
dtype: string
- name: description
dtype: string
- name: ad
dtype: string
splits:
- name: train
num_bytes: 3211
num_examples: 5
download_size: 0
dataset_size: 3211
---
# Dataset Card for "GenAI-sample"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lalanikarim/glaive-function-calling-v2 | ---
language:
- en
license: apache-2.0
dataset_info:
features:
- name: chat
dtype: string
- name: system
dtype: string
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 157115139.42965484
num_examples: 71678
- name: test
num_bytes: 17458942.570345167
num_examples: 7965
download_size: 47435441
dataset_size: 174574082.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
claudiostu/voz1 | ---
license: openrail
---
|
ControlNet/LAV-DF | ---
language:
- en
tags:
- deepfakes
- video
pretty_name: LAV-DF
task_categories:
- video-classification
paperswithcode_id: lav-df
license: cc
size_categories:
- 100K<n<1M
extra_gated_heading: Access LAV-DF dataset on Hugging Face
extra_gated_prompt: To use this LAV-DF dataset, you should agree to the [terms and conditions](https://github.com/ControlNet/LAV-DF/blob/master/TERMS_AND_CONDITIONS.md).
---
# Localized Audio Visual DeepFake Dataset (LAV-DF)
This repo is the dataset for the DICTA paper [Do You Really Mean That? Content Driven Audio-Visual
Deepfake Dataset and Multimodal Method for Temporal Forgery Localization](https://ieeexplore.ieee.org/document/10034605)
(Best Award), and the journal paper ["Glitch in the Matrix!": A Large Scale Benchmark for Content Driven Audio-Visual
Forgery Detection and Localization](https://arxiv.org/abs/2305.01979) submitted to CVIU.
## LAV-DF Dataset
### Download
To use this LAV-DF dataset, you should agree to the [terms and conditions](https://github.com/ControlNet/LAV-DF/blob/master/TERMS_AND_CONDITIONS.md).
Download link: [Google Drive](https://drive.google.com/file/d/1-OQ-NDtdEyqHNLaZU1Lt9Upk5wVqfYJw/view?usp=sharing), or [HuggingFace](https://huggingface.co/datasets/ControlNet/LAV-DF).
## License
This project is under the CC BY-NC 4.0 license. See [LICENSE](https://github.com/ControlNet/LAV-DF/blob/master/LICENSE) for details.
## References
If you find this work useful in your research, please cite the following papers.
The conference paper,
```bibtex
@inproceedings{cai2022you,
title = {Do You Really Mean That? Content Driven Audio-Visual Deepfake Dataset and Multimodal Method for Temporal Forgery Localization},
author = {Cai, Zhixi and Stefanov, Kalin and Dhall, Abhinav and Hayat, Munawar},
booktitle = {2022 International Conference on Digital Image Computing: Techniques and Applications (DICTA)},
year = {2022},
doi = {10.1109/DICTA56598.2022.10034605},
pages = {1--10},
address = {Sydney, Australia},
}
```
The extended journal version currently under review at CVIU,
```bibtex
@article{cai2023glitch,
title = {Glitch in the Matrix: A Large Scale Benchmark for Content Driven Audio-Visual Forgery Detection and Localization},
author = {Cai, Zhixi and Ghosh, Shreya and Dhall, Abhinav and Gedeon, Tom and Stefanov, Kalin and Hayat, Munawar},
journal = {arXiv preprint arXiv:2305.01979},
year = {2023},
}
``` |
HiTZ/casimedicos-squad | ---
license: cc-by-4.0
language:
- es
tags:
- casimedicos
- explainability
- medical exams
- medical question answering
- extractive question answering
- squad
- multilinguality
- LLMs
- LLM
pretty_name: casimedicos-squad
configs:
- config_name: es
data_files:
- split: train
path:
- data/es/es_train_casimedicos_squad.json
- split: validation
path:
- data/es/es_dev_casimedicos_squad.json
- split: test
path:
- data/es/es_test_casimedicos_squad.json
task_categories:
- question-answering
size_categories:
- 1K<n<10K
---
<p align="center">
<br>
<img src="http://www.ixa.eus/sites/default/files/anitdote.png" style="height: 200px;">
<br>
</p>
# Antidote CasiMedicos in SQuAD Format for Explanatory Argument Extraction
We present a new multilingual parallel medical dataset of commented medical exams which includes not only explanatory arguments
for the correct answer but also arguments to explain why the remaining possible answers are incorrect.
Furthermore, this dataset allows us to setup a **novel extractive task**
which consists of **identifying the explanation of the correct answer written by
medical doctors**. In order to do so we leverage the
SQuAD extractive QA paradigm to automatically evaluate performance of language models to identify the explanation of the correct answer
in medical exams without relying on costly manual evaluation by medical experts.
The data source consists of Resident Medical Intern or Médico Interno Residente (MIR) exams, originally
created by [CasiMedicos](https://www.casimedicos.com), a Spanish community of medical professionals who collaboratively, voluntarily,
and free of charge, publishes written explanations about the possible answers included in the MIR exams. The aim is to generate a resource that
helps future medical doctors to study towards the MIR examinations. The commented MIR exams, including the explanations, are published in the [CasiMedicos
Project MIR 2.0 website](https://www.casimedicos.com/mir-2-0/).
We have extracted, cleaned, structured, and annotated the available data so that each document in **casimedicos-squad** includes the clinical case, the correct answer,
the multiple-choice questions, and the commented exam written by native Spanish medical doctors. The comments have been annotated with the span in the text that
corresponds to the explanation of the correct answer (see example below).
<table style="width:33%">
<tr>
<th>casimedicos-squad splits</th>
<th>documents</th>
</tr>
<tr>
<td>train</td>
<td>404</td>
</tr>
<tr>
<td>validation</td>
<td>56</td>
</tr>
<tr>
<td>test</td>
<td>119</td>
</tr>
</table>
- Paper: [Explanatory Argument Extraction of Correct Answers in Resident Medical Exams](https://arxiv.org/abs/2312.00567)
- Github Repo (Data and Code): [https://github.com/ixa-ehu/antidote-casimedicos](https://github.com/ixa-ehu/antidote-casimedicos)
- Project Website: [https://univ-cotedazur.eu/antidote](https://univ-cotedazur.eu/antidote)
- Funding: CHIST-ERA XAI 2019 call. Antidote (PCI2020-120717-2) funded by MCIN/AEI /10.13039/501100011033 and by European Union NextGenerationEU/PRTR
## Example
<p align="center">
<img src="https://github.com/ixa-ehu/antidote-casimedicos/blob/main/casimedicos-exp.png?raw=true" style="height: 650px;">
</p>
The example above shows a document in CasiMedicos containing the textual content, including Clinical Case (C), Question (Q), Possible Answers (P),
and Explanation (E). Furthermore, for **casimedicos-squad** we annotated the span in the explanation (E) that corresponds to the correct answer (A).
The process of manually annotating the corpus consisted of specifying where the explanations of the correct answers begin and end.
In order to obtain grammatically complete correct answer explanations, annotating full sentences or subordinate clauses was preferred over
shorter spans.
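The extractive setup is scored like SQuAD, by comparing a model's predicted span against the gold explanation span. As a rough sketch (not the official SQuAD evaluation script, which also applies answer normalization), token-level F1 between two spans can be computed as:

```python
# Sketch of SQuAD-style scoring: token-level F1 between a predicted span and
# the gold explanation span (simplified; no normalization rules from the
# official SQuAD evaluation script).
from collections import Counter

def token_f1(prediction: str, gold: str) -> float:
    pred_tokens = prediction.lower().split()
    gold_tokens = gold.lower().split()
    # Multiset intersection counts shared tokens with multiplicity.
    common = Counter(pred_tokens) & Counter(gold_tokens)
    overlap = sum(common.values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(pred_tokens)
    recall = overlap / len(gold_tokens)
    return 2 * precision * recall / (precision + recall)

print(token_f1("the correct answer is atropine", "the correct answer is atropine"))  # 1.0
```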
## Data Explanation
The dataset is structured as a list of documents ("paragraphs"), each of which includes:
- **context**: the explanation (E) in the document
- **qas**: list of possible answers and questions. This element contains:
- **answers**: an answer which corresponds to the explanation of the correct answer (A)
- **question**: the clinical case (C) and question (Q)
- **id**: unique identifier for the document
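As a minimal sketch (not part of the original card), the structure above can be walked like any SQuAD-format file. The record below is hypothetical and only mirrors the fields described (context, qas, answers, question, id):

```python
# Sketch: iterate over a SQuAD-style casimedicos record. The sample record is
# hypothetical and only mirrors the fields described above.
sample = {
    "paragraphs": [
        {
            "context": "The correct management is X because ...",
            "qas": [
                {
                    "id": "doc-0001",
                    "question": "Clinical case ... Which is the correct answer?",
                    "answers": [
                        {"text": "The correct management is X because ...",
                         "answer_start": 0}
                    ],
                }
            ],
        }
    ]
}

for paragraph in sample["paragraphs"]:
    context = paragraph["context"]
    for qa in paragraph["qas"]:
        answer = qa["answers"][0]
        start, span = answer["answer_start"], answer["text"]
        # The annotated explanation span must occur in the context at answer_start.
        assert context[start:start + len(span)] == span
        print(qa["id"], "->", span)
```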
## Citation
If you use this data please **cite the following paper**:
```bibtex
@misc{goenaga2023explanatory,
title={Explanatory Argument Extraction of Correct Answers in Resident Medical Exams},
author={Iakes Goenaga and Aitziber Atutxa and Koldo Gojenola and Maite Oronoz and Rodrigo Agerri},
year={2023},
eprint={2312.00567},
archivePrefix={arXiv}
}
```
**Contact**: [Iakes Goenaga](http://www.hitz.eus/es/node/65) and [Rodrigo Agerri](https://ragerri.github.io/)
HiTZ Center - Ixa, University of the Basque Country UPV/EHU |
NobodyExistsOnTheInternet/ConvoOrcaShareGPT4096 | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: evolutions
sequence: string
- name: textversion
dtype: string
- name: tokens
dtype: int64
- name: too_long
dtype: bool
splits:
- name: train
num_bytes: 1808371367
num_examples: 46563
download_size: 882548173
dataset_size: 1808371367
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_jondurbin__bagel-34b-v0.2 | ---
pretty_name: Evaluation run of jondurbin/bagel-34b-v0.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jondurbin/bagel-34b-v0.2](https://huggingface.co/jondurbin/bagel-34b-v0.2) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__bagel-34b-v0.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-05T02:46:07.466495](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__bagel-34b-v0.2/blob/main/results_2024-01-05T02-46-07.466495.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7566584880323868,\n\
\ \"acc_stderr\": 0.028404446518444006,\n \"acc_norm\": 0.7644443889276894,\n\
\ \"acc_norm_stderr\": 0.028932547734181486,\n \"mc1\": 0.4369645042839657,\n\
\ \"mc1_stderr\": 0.017363844503195985,\n \"mc2\": 0.592598246243346,\n\
\ \"mc2_stderr\": 0.014870176336077599\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.64419795221843,\n \"acc_stderr\": 0.01399057113791876,\n\
\ \"acc_norm\": 0.6877133105802048,\n \"acc_norm_stderr\": 0.013542598541688065\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6347341167098187,\n\
\ \"acc_stderr\": 0.004805205798724566,\n \"acc_norm\": 0.8371838279227246,\n\
\ \"acc_norm_stderr\": 0.0036844333238877946\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.03785714465066653,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.03785714465066653\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.868421052631579,\n \"acc_stderr\": 0.027508689533549915,\n\
\ \"acc_norm\": 0.868421052631579,\n \"acc_norm_stderr\": 0.027508689533549915\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n\
\ \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.024618298195866514,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.024618298195866514\n \
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8958333333333334,\n\
\ \"acc_stderr\": 0.025545239210256917,\n \"acc_norm\": 0.8958333333333334,\n\
\ \"acc_norm_stderr\": 0.025545239210256917\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562429,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562429\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7514450867052023,\n\
\ \"acc_stderr\": 0.03295304696818317,\n \"acc_norm\": 0.7514450867052023,\n\
\ \"acc_norm_stderr\": 0.03295304696818317\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n\
\ \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7914893617021277,\n \"acc_stderr\": 0.026556982117838742,\n\
\ \"acc_norm\": 0.7914893617021277,\n \"acc_norm_stderr\": 0.026556982117838742\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5614035087719298,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.5614035087719298,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7655172413793103,\n \"acc_stderr\": 0.035306258743465914,\n\
\ \"acc_norm\": 0.7655172413793103,\n \"acc_norm_stderr\": 0.035306258743465914\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.7037037037037037,\n \"acc_stderr\": 0.023517294335963286,\n \"\
acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.023517294335963286\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.6031746031746031,\n\
\ \"acc_stderr\": 0.043758884927270585,\n \"acc_norm\": 0.6031746031746031,\n\
\ \"acc_norm_stderr\": 0.043758884927270585\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8903225806451613,\n\
\ \"acc_stderr\": 0.01777677870048519,\n \"acc_norm\": 0.8903225806451613,\n\
\ \"acc_norm_stderr\": 0.01777677870048519\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.645320197044335,\n \"acc_stderr\": 0.03366124489051449,\n\
\ \"acc_norm\": 0.645320197044335,\n \"acc_norm_stderr\": 0.03366124489051449\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\"\
: 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8606060606060606,\n \"acc_stderr\": 0.027045948825865394,\n\
\ \"acc_norm\": 0.8606060606060606,\n \"acc_norm_stderr\": 0.027045948825865394\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9242424242424242,\n \"acc_stderr\": 0.018852670234993093,\n \"\
acc_norm\": 0.9242424242424242,\n \"acc_norm_stderr\": 0.018852670234993093\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9637305699481865,\n \"acc_stderr\": 0.013492659751295138,\n\
\ \"acc_norm\": 0.9637305699481865,\n \"acc_norm_stderr\": 0.013492659751295138\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.8179487179487179,\n \"acc_stderr\": 0.019565236782930893,\n\
\ \"acc_norm\": 0.8179487179487179,\n \"acc_norm_stderr\": 0.019565236782930893\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.43333333333333335,\n \"acc_stderr\": 0.030213340289237924,\n \
\ \"acc_norm\": 0.43333333333333335,\n \"acc_norm_stderr\": 0.030213340289237924\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8403361344537815,\n \"acc_stderr\": 0.023793353997528802,\n\
\ \"acc_norm\": 0.8403361344537815,\n \"acc_norm_stderr\": 0.023793353997528802\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4900662251655629,\n \"acc_stderr\": 0.04081677107248436,\n \"\
acc_norm\": 0.4900662251655629,\n \"acc_norm_stderr\": 0.04081677107248436\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9155963302752294,\n \"acc_stderr\": 0.011918819327334872,\n \"\
acc_norm\": 0.9155963302752294,\n \"acc_norm_stderr\": 0.011918819327334872\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6620370370370371,\n \"acc_stderr\": 0.03225941352631295,\n \"\
acc_norm\": 0.6620370370370371,\n \"acc_norm_stderr\": 0.03225941352631295\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9166666666666666,\n \"acc_stderr\": 0.019398452135813905,\n \"\
acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.019398452135813905\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9071729957805907,\n \"acc_stderr\": 0.01888975055095671,\n \
\ \"acc_norm\": 0.9071729957805907,\n \"acc_norm_stderr\": 0.01888975055095671\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8071748878923767,\n\
\ \"acc_stderr\": 0.026478240960489365,\n \"acc_norm\": 0.8071748878923767,\n\
\ \"acc_norm_stderr\": 0.026478240960489365\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.028718776889342327,\n\
\ \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.028718776889342327\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8842975206611571,\n \"acc_stderr\": 0.0291998024556228,\n \"acc_norm\"\
: 0.8842975206611571,\n \"acc_norm_stderr\": 0.0291998024556228\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.030381596756651655,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.030381596756651655\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8588957055214724,\n \"acc_stderr\": 0.027351605518389752,\n\
\ \"acc_norm\": 0.8588957055214724,\n \"acc_norm_stderr\": 0.027351605518389752\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5803571428571429,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.5803571428571429,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8932038834951457,\n \"acc_stderr\": 0.030581088928331356,\n\
\ \"acc_norm\": 0.8932038834951457,\n \"acc_norm_stderr\": 0.030581088928331356\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9401709401709402,\n\
\ \"acc_stderr\": 0.015537514263253869,\n \"acc_norm\": 0.9401709401709402,\n\
\ \"acc_norm_stderr\": 0.015537514263253869\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9067688378033205,\n\
\ \"acc_stderr\": 0.010397417087292847,\n \"acc_norm\": 0.9067688378033205,\n\
\ \"acc_norm_stderr\": 0.010397417087292847\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8265895953757225,\n \"acc_stderr\": 0.020383229551135022,\n\
\ \"acc_norm\": 0.8265895953757225,\n \"acc_norm_stderr\": 0.020383229551135022\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.8,\n\
\ \"acc_stderr\": 0.013378001241813075,\n \"acc_norm\": 0.8,\n \
\ \"acc_norm_stderr\": 0.013378001241813075\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8398692810457516,\n \"acc_stderr\": 0.020998740930362303,\n\
\ \"acc_norm\": 0.8398692810457516,\n \"acc_norm_stderr\": 0.020998740930362303\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8038585209003215,\n\
\ \"acc_stderr\": 0.022552447780478033,\n \"acc_norm\": 0.8038585209003215,\n\
\ \"acc_norm_stderr\": 0.022552447780478033\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8580246913580247,\n \"acc_stderr\": 0.019420260109438293,\n\
\ \"acc_norm\": 0.8580246913580247,\n \"acc_norm_stderr\": 0.019420260109438293\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6276595744680851,\n \"acc_stderr\": 0.02883892147125145,\n \
\ \"acc_norm\": 0.6276595744680851,\n \"acc_norm_stderr\": 0.02883892147125145\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5847457627118644,\n\
\ \"acc_stderr\": 0.012585471793400664,\n \"acc_norm\": 0.5847457627118644,\n\
\ \"acc_norm_stderr\": 0.012585471793400664\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8051470588235294,\n \"acc_stderr\": 0.02406059942348742,\n\
\ \"acc_norm\": 0.8051470588235294,\n \"acc_norm_stderr\": 0.02406059942348742\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8104575163398693,\n \"acc_stderr\": 0.015856152189980256,\n \
\ \"acc_norm\": 0.8104575163398693,\n \"acc_norm_stderr\": 0.015856152189980256\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8326530612244898,\n \"acc_stderr\": 0.02389714476891452,\n\
\ \"acc_norm\": 0.8326530612244898,\n \"acc_norm_stderr\": 0.02389714476891452\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9104477611940298,\n\
\ \"acc_stderr\": 0.02019067053502791,\n \"acc_norm\": 0.9104477611940298,\n\
\ \"acc_norm_stderr\": 0.02019067053502791\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n\
\ \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n\
\ \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.024103384202072864,\n\
\ \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.024103384202072864\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4369645042839657,\n\
\ \"mc1_stderr\": 0.017363844503195985,\n \"mc2\": 0.592598246243346,\n\
\ \"mc2_stderr\": 0.014870176336077599\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8382004735595896,\n \"acc_stderr\": 0.010350128010292406\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.46171341925701287,\n \
\ \"acc_stderr\": 0.013732048227016682\n }\n}\n```"
repo_url: https://huggingface.co/jondurbin/bagel-34b-v0.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|arc:challenge|25_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|gsm8k|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hellaswag|10_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T02-46-07.466495.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T02-46-07.466495.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- '**/details_harness|winogrande|5_2024-01-05T02-46-07.466495.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-05T02-46-07.466495.parquet'
- config_name: results
data_files:
- split: 2024_01_05T02_46_07.466495
path:
- results_2024-01-05T02-46-07.466495.parquet
- split: latest
path:
- results_2024-01-05T02-46-07.466495.parquet
---
# Dataset Card for Evaluation run of jondurbin/bagel-34b-v0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jondurbin/bagel-34b-v0.2](https://huggingface.co/jondurbin/bagel-34b-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__bagel-34b-v0.2",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-05T02:46:07.466495](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__bagel-34b-v0.2/blob/main/results_2024-01-05T02-46-07.466495.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7566584880323868,
"acc_stderr": 0.028404446518444006,
"acc_norm": 0.7644443889276894,
"acc_norm_stderr": 0.028932547734181486,
"mc1": 0.4369645042839657,
"mc1_stderr": 0.017363844503195985,
"mc2": 0.592598246243346,
"mc2_stderr": 0.014870176336077599
},
"harness|arc:challenge|25": {
"acc": 0.64419795221843,
"acc_stderr": 0.01399057113791876,
"acc_norm": 0.6877133105802048,
"acc_norm_stderr": 0.013542598541688065
},
"harness|hellaswag|10": {
"acc": 0.6347341167098187,
"acc_stderr": 0.004805205798724566,
"acc_norm": 0.8371838279227246,
"acc_norm_stderr": 0.0036844333238877946
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.03785714465066653,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.03785714465066653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.868421052631579,
"acc_stderr": 0.027508689533549915,
"acc_norm": 0.868421052631579,
"acc_norm_stderr": 0.027508689533549915
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8,
"acc_stderr": 0.024618298195866514,
"acc_norm": 0.8,
"acc_norm_stderr": 0.024618298195866514
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8958333333333334,
"acc_stderr": 0.025545239210256917,
"acc_norm": 0.8958333333333334,
"acc_norm_stderr": 0.025545239210256917
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562429,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562429
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.03295304696818317,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.03295304696818317
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7914893617021277,
"acc_stderr": 0.026556982117838742,
"acc_norm": 0.7914893617021277,
"acc_norm_stderr": 0.026556982117838742
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5614035087719298,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.5614035087719298,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7655172413793103,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.7655172413793103,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.023517294335963286,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.023517294335963286
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.6031746031746031,
"acc_stderr": 0.043758884927270585,
"acc_norm": 0.6031746031746031,
"acc_norm_stderr": 0.043758884927270585
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8903225806451613,
"acc_stderr": 0.01777677870048519,
"acc_norm": 0.8903225806451613,
"acc_norm_stderr": 0.01777677870048519
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.645320197044335,
"acc_stderr": 0.03366124489051449,
"acc_norm": 0.645320197044335,
"acc_norm_stderr": 0.03366124489051449
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8606060606060606,
"acc_stderr": 0.027045948825865394,
"acc_norm": 0.8606060606060606,
"acc_norm_stderr": 0.027045948825865394
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9242424242424242,
"acc_stderr": 0.018852670234993093,
"acc_norm": 0.9242424242424242,
"acc_norm_stderr": 0.018852670234993093
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9637305699481865,
"acc_stderr": 0.013492659751295138,
"acc_norm": 0.9637305699481865,
"acc_norm_stderr": 0.013492659751295138
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8179487179487179,
"acc_stderr": 0.019565236782930893,
"acc_norm": 0.8179487179487179,
"acc_norm_stderr": 0.019565236782930893
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.43333333333333335,
"acc_stderr": 0.030213340289237924,
"acc_norm": 0.43333333333333335,
"acc_norm_stderr": 0.030213340289237924
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8403361344537815,
"acc_stderr": 0.023793353997528802,
"acc_norm": 0.8403361344537815,
"acc_norm_stderr": 0.023793353997528802
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4900662251655629,
"acc_stderr": 0.04081677107248436,
"acc_norm": 0.4900662251655629,
"acc_norm_stderr": 0.04081677107248436
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9155963302752294,
"acc_stderr": 0.011918819327334872,
"acc_norm": 0.9155963302752294,
"acc_norm_stderr": 0.011918819327334872
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6620370370370371,
"acc_stderr": 0.03225941352631295,
"acc_norm": 0.6620370370370371,
"acc_norm_stderr": 0.03225941352631295
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.019398452135813905,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.019398452135813905
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9071729957805907,
"acc_stderr": 0.01888975055095671,
"acc_norm": 0.9071729957805907,
"acc_norm_stderr": 0.01888975055095671
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8071748878923767,
"acc_stderr": 0.026478240960489365,
"acc_norm": 0.8071748878923767,
"acc_norm_stderr": 0.026478240960489365
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8778625954198473,
"acc_stderr": 0.028718776889342327,
"acc_norm": 0.8778625954198473,
"acc_norm_stderr": 0.028718776889342327
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.0291998024556228,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.0291998024556228
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.030381596756651655,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.030381596756651655
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8588957055214724,
"acc_stderr": 0.027351605518389752,
"acc_norm": 0.8588957055214724,
"acc_norm_stderr": 0.027351605518389752
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5803571428571429,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.5803571428571429,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.8932038834951457,
"acc_stderr": 0.030581088928331356,
"acc_norm": 0.8932038834951457,
"acc_norm_stderr": 0.030581088928331356
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9401709401709402,
"acc_stderr": 0.015537514263253869,
"acc_norm": 0.9401709401709402,
"acc_norm_stderr": 0.015537514263253869
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9067688378033205,
"acc_stderr": 0.010397417087292847,
"acc_norm": 0.9067688378033205,
"acc_norm_stderr": 0.010397417087292847
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8265895953757225,
"acc_stderr": 0.020383229551135022,
"acc_norm": 0.8265895953757225,
"acc_norm_stderr": 0.020383229551135022
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.8,
"acc_stderr": 0.013378001241813075,
"acc_norm": 0.8,
"acc_norm_stderr": 0.013378001241813075
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8398692810457516,
"acc_stderr": 0.020998740930362303,
"acc_norm": 0.8398692810457516,
"acc_norm_stderr": 0.020998740930362303
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8038585209003215,
"acc_stderr": 0.022552447780478033,
"acc_norm": 0.8038585209003215,
"acc_norm_stderr": 0.022552447780478033
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8580246913580247,
"acc_stderr": 0.019420260109438293,
"acc_norm": 0.8580246913580247,
"acc_norm_stderr": 0.019420260109438293
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6276595744680851,
"acc_stderr": 0.02883892147125145,
"acc_norm": 0.6276595744680851,
"acc_norm_stderr": 0.02883892147125145
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5847457627118644,
"acc_stderr": 0.012585471793400664,
"acc_norm": 0.5847457627118644,
"acc_norm_stderr": 0.012585471793400664
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8051470588235294,
"acc_stderr": 0.02406059942348742,
"acc_norm": 0.8051470588235294,
"acc_norm_stderr": 0.02406059942348742
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8104575163398693,
"acc_stderr": 0.015856152189980256,
"acc_norm": 0.8104575163398693,
"acc_norm_stderr": 0.015856152189980256
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8326530612244898,
"acc_stderr": 0.02389714476891452,
"acc_norm": 0.8326530612244898,
"acc_norm_stderr": 0.02389714476891452
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9104477611940298,
"acc_stderr": 0.02019067053502791,
"acc_norm": 0.9104477611940298,
"acc_norm_stderr": 0.02019067053502791
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.024103384202072864,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.024103384202072864
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4369645042839657,
"mc1_stderr": 0.017363844503195985,
"mc2": 0.592598246243346,
"mc2_stderr": 0.014870176336077599
},
"harness|winogrande|5": {
"acc": 0.8382004735595896,
"acc_stderr": 0.010350128010292406
},
"harness|gsm8k|5": {
"acc": 0.46171341925701287,
"acc_stderr": 0.013732048227016682
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
rafatecno1/rafa | ---
license: openrail
---
|
alvations/c4p0-x1-en-fr | ---
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: string
- name: target_backto_source
dtype: string
- name: raw_target
list:
- name: generated_text
dtype: string
- name: raw_target_backto_source
list:
- name: generated_text
dtype: string
- name: prompt
dtype: string
- name: reverse_prompt
dtype: string
- name: source_langid
dtype: string
- name: target_langid
dtype: string
- name: target_backto_source_langid
dtype: string
- name: doc_id
dtype: int64
- name: sent_id
dtype: int64
- name: timestamp
dtype: string
- name: url
dtype: string
- name: doc_hash
dtype: string
splits:
- name: train
num_bytes: 4047
num_examples: 2
download_size: 21379
dataset_size: 4047
configs:
- config_name: default
data_files:
- split: train
path: 37ded8bbdbf8c054/train-*
---
|
pib | ---
task_categories:
- translation
- text-generation
- fill-mask
task_ids:
- language-modeling
- masked-language-modeling
multilinguality:
- translation
language:
- bn
- en
- gu
- hi
- ml
- mr
- or
- pa
- ta
- te
- ur
language_creators:
- other
annotations_creators:
- no-annotation
source_datasets:
- original
size_categories:
- 100K<n<1M
- 10K<n<100K
license:
- cc-by-4.0
paperswithcode_id: null
pretty_name: CVIT PIB
dataset_info:
- config_name: or-ur
features:
- name: translation
dtype:
translation:
languages:
- or
- ur
splits:
- name: train
num_bytes: 27790211
num_examples: 43766
download_size: 393352875
dataset_size: 27790211
- config_name: ml-or
features:
- name: translation
dtype:
translation:
languages:
- ml
- or
splits:
- name: train
num_bytes: 16011549
num_examples: 19413
download_size: 393352875
dataset_size: 16011549
- config_name: bn-ta
features:
- name: translation
dtype:
translation:
languages:
- bn
- ta
splits:
- name: train
num_bytes: 28706668
num_examples: 33005
download_size: 393352875
dataset_size: 28706668
- config_name: gu-mr
features:
- name: translation
dtype:
translation:
languages:
- gu
- mr
splits:
- name: train
num_bytes: 24253770
num_examples: 30766
download_size: 393352875
dataset_size: 24253770
- config_name: hi-or
features:
- name: translation
dtype:
translation:
languages:
- hi
- or
splits:
- name: train
num_bytes: 45086618
num_examples: 61070
download_size: 393352875
dataset_size: 45086618
- config_name: en-or
features:
- name: translation
dtype:
translation:
languages:
- en
- or
splits:
- name: train
num_bytes: 51258494
num_examples: 98230
download_size: 393352875
dataset_size: 51258494
- config_name: mr-ur
features:
- name: translation
dtype:
translation:
languages:
- mr
- ur
splits:
- name: train
num_bytes: 34053295
num_examples: 49691
download_size: 393352875
dataset_size: 34053295
- config_name: en-ta
features:
- name: translation
dtype:
translation:
languages:
- en
- ta
splits:
- name: train
num_bytes: 74931542
num_examples: 118759
download_size: 393352875
dataset_size: 74931542
- config_name: hi-ta
features:
- name: translation
dtype:
translation:
languages:
- hi
- ta
splits:
- name: train
num_bytes: 57628429
num_examples: 64945
download_size: 393352875
dataset_size: 57628429
- config_name: bn-en
features:
- name: translation
dtype:
translation:
languages:
- bn
- en
splits:
- name: train
num_bytes: 53291968
num_examples: 93560
download_size: 393352875
dataset_size: 53291968
- config_name: bn-or
features:
- name: translation
dtype:
translation:
languages:
- bn
- or
splits:
- name: train
num_bytes: 19819136
num_examples: 26456
download_size: 393352875
dataset_size: 19819136
- config_name: ml-ta
features:
- name: translation
dtype:
translation:
languages:
- ml
- ta
splits:
- name: train
num_bytes: 21685938
num_examples: 23609
download_size: 393352875
dataset_size: 21685938
- config_name: gu-ur
features:
- name: translation
dtype:
translation:
languages:
- gu
- ur
splits:
- name: train
num_bytes: 20312414
num_examples: 29938
download_size: 393352875
dataset_size: 20312414
- config_name: bn-ml
features:
- name: translation
dtype:
translation:
languages:
- bn
- ml
splits:
- name: train
num_bytes: 15545271
num_examples: 18149
download_size: 393352875
dataset_size: 15545271
- config_name: ml-pa
features:
- name: translation
dtype:
translation:
languages:
- ml
- pa
splits:
- name: train
num_bytes: 18114904
num_examples: 21978
download_size: 393352875
dataset_size: 18114904
- config_name: en-pa
features:
- name: translation
dtype:
translation:
languages:
- en
- pa
splits:
- name: train
num_bytes: 56316514
num_examples: 103296
download_size: 393352875
dataset_size: 56316514
- config_name: bn-hi
features:
- name: translation
dtype:
translation:
languages:
- bn
- hi
splits:
- name: train
num_bytes: 40970170
num_examples: 49598
download_size: 393352875
dataset_size: 40970170
- config_name: hi-pa
features:
- name: translation
dtype:
translation:
languages:
- hi
- pa
splits:
- name: train
num_bytes: 59293062
num_examples: 75200
download_size: 393352875
dataset_size: 59293062
- config_name: gu-te
features:
- name: translation
dtype:
translation:
languages:
- gu
- te
splits:
- name: train
num_bytes: 14517828
num_examples: 16335
download_size: 393352875
dataset_size: 14517828
- config_name: pa-ta
features:
- name: translation
dtype:
translation:
languages:
- pa
- ta
splits:
- name: train
num_bytes: 39144065
num_examples: 46349
download_size: 393352875
dataset_size: 39144065
- config_name: hi-ml
features:
- name: translation
dtype:
translation:
languages:
- hi
- ml
splits:
- name: train
num_bytes: 24015298
num_examples: 27167
download_size: 393352875
dataset_size: 24015298
- config_name: or-te
features:
- name: translation
dtype:
translation:
languages:
- or
- te
splits:
- name: train
num_bytes: 9011734
num_examples: 10475
download_size: 393352875
dataset_size: 9011734
- config_name: en-ml
features:
- name: translation
dtype:
translation:
languages:
- en
- ml
splits:
- name: train
num_bytes: 27754969
num_examples: 44986
download_size: 393352875
dataset_size: 27754969
- config_name: en-hi
features:
- name: translation
dtype:
translation:
languages:
- en
- hi
splits:
- name: train
num_bytes: 160009440
num_examples: 269594
download_size: 393352875
dataset_size: 160009440
- config_name: bn-pa
features:
- name: translation
dtype:
translation:
languages:
- bn
- pa
splits:
- name: train
num_bytes: 27522373
num_examples: 35109
download_size: 393352875
dataset_size: 27522373
- config_name: mr-te
features:
- name: translation
dtype:
translation:
languages:
- mr
- te
splits:
- name: train
num_bytes: 16838115
num_examples: 18179
download_size: 393352875
dataset_size: 16838115
- config_name: mr-pa
features:
- name: translation
dtype:
translation:
languages:
- mr
- pa
splits:
- name: train
num_bytes: 38720410
num_examples: 50418
download_size: 393352875
dataset_size: 38720410
- config_name: bn-te
features:
- name: translation
dtype:
translation:
languages:
- bn
- te
splits:
- name: train
num_bytes: 15529843
num_examples: 17605
download_size: 393352875
dataset_size: 15529843
- config_name: gu-hi
features:
- name: translation
dtype:
translation:
languages:
- gu
- hi
splits:
- name: train
num_bytes: 33606230
num_examples: 41587
download_size: 393352875
dataset_size: 33606230
- config_name: ta-ur
features:
- name: translation
dtype:
translation:
languages:
- ta
- ur
splits:
- name: train
num_bytes: 37593813
num_examples: 48892
download_size: 393352875
dataset_size: 37593813
- config_name: te-ur
features:
- name: translation
dtype:
translation:
languages:
- te
- ur
splits:
- name: train
num_bytes: 16485209
num_examples: 21148
download_size: 393352875
dataset_size: 16485209
- config_name: or-pa
features:
- name: translation
dtype:
translation:
languages:
- or
- pa
splits:
- name: train
num_bytes: 30081903
num_examples: 43159
download_size: 393352875
dataset_size: 30081903
- config_name: gu-ml
features:
- name: translation
dtype:
translation:
languages:
- gu
- ml
splits:
- name: train
num_bytes: 15749821
num_examples: 18252
download_size: 393352875
dataset_size: 15749821
- config_name: gu-pa
features:
- name: translation
dtype:
translation:
languages:
- gu
- pa
splits:
- name: train
num_bytes: 27441041
num_examples: 35566
download_size: 393352875
dataset_size: 27441041
- config_name: hi-te
features:
- name: translation
dtype:
translation:
languages:
- hi
- te
splits:
- name: train
num_bytes: 26473814
num_examples: 28569
download_size: 393352875
dataset_size: 26473814
- config_name: en-te
features:
- name: translation
dtype:
translation:
languages:
- en
- te
splits:
- name: train
num_bytes: 28620219
num_examples: 44888
download_size: 393352875
dataset_size: 28620219
- config_name: ml-te
features:
- name: translation
dtype:
translation:
languages:
- ml
- te
splits:
- name: train
num_bytes: 9690153
num_examples: 10480
download_size: 393352875
dataset_size: 9690153
- config_name: pa-ur
features:
- name: translation
dtype:
translation:
languages:
- pa
- ur
splits:
- name: train
num_bytes: 34959176
num_examples: 51831
download_size: 393352875
dataset_size: 34959176
- config_name: hi-ur
features:
- name: translation
dtype:
translation:
languages:
- hi
- ur
splits:
- name: train
num_bytes: 81262590
num_examples: 109951
download_size: 393352875
dataset_size: 81262590
- config_name: mr-or
features:
- name: translation
dtype:
translation:
languages:
- mr
- or
splits:
- name: train
num_bytes: 33998805
num_examples: 47001
download_size: 393352875
dataset_size: 33998805
- config_name: en-ur
features:
- name: translation
dtype:
translation:
languages:
- en
- ur
splits:
- name: train
num_bytes: 100571795
num_examples: 202578
download_size: 393352875
dataset_size: 100571795
- config_name: ml-ur
features:
- name: translation
dtype:
translation:
languages:
- ml
- ur
splits:
- name: train
num_bytes: 15663718
num_examples: 20913
download_size: 393352875
dataset_size: 15663718
- config_name: bn-mr
features:
- name: translation
dtype:
translation:
languages:
- bn
- mr
splits:
- name: train
num_bytes: 27604502
num_examples: 34043
download_size: 393352875
dataset_size: 27604502
- config_name: gu-ta
features:
- name: translation
dtype:
translation:
languages:
- gu
- ta
splits:
- name: train
num_bytes: 25089131
num_examples: 29187
download_size: 393352875
dataset_size: 25089131
- config_name: pa-te
features:
- name: translation
dtype:
translation:
languages:
- pa
- te
splits:
- name: train
num_bytes: 23119690
num_examples: 25684
download_size: 393352875
dataset_size: 23119690
- config_name: bn-gu
features:
- name: translation
dtype:
translation:
languages:
- bn
- gu
splits:
- name: train
num_bytes: 19899277
num_examples: 25166
download_size: 393352875
dataset_size: 19899277
- config_name: bn-ur
features:
- name: translation
dtype:
translation:
languages:
- bn
- ur
splits:
- name: train
num_bytes: 27540215
num_examples: 39290
download_size: 393352875
dataset_size: 27540215
- config_name: ml-mr
features:
- name: translation
dtype:
translation:
languages:
- ml
- mr
splits:
- name: train
num_bytes: 19723458
num_examples: 22796
download_size: 393352875
dataset_size: 19723458
- config_name: or-ta
features:
- name: translation
dtype:
translation:
languages:
- or
- ta
splits:
- name: train
num_bytes: 35357904
num_examples: 44035
download_size: 393352875
dataset_size: 35357904
- config_name: ta-te
features:
- name: translation
dtype:
translation:
languages:
- ta
- te
splits:
- name: train
num_bytes: 17415768
num_examples: 17359
download_size: 393352875
dataset_size: 17415768
- config_name: gu-or
features:
- name: translation
dtype:
translation:
languages:
- gu
- or
splits:
- name: train
num_bytes: 20111876
num_examples: 27162
download_size: 393352875
dataset_size: 20111876
- config_name: en-gu
features:
- name: translation
dtype:
translation:
languages:
- en
- gu
splits:
- name: train
num_bytes: 33630906
num_examples: 59739
download_size: 393352875
dataset_size: 33630906
- config_name: hi-mr
features:
- name: translation
dtype:
translation:
languages:
- hi
- mr
splits:
- name: train
num_bytes: 55680473
num_examples: 69186
download_size: 393352875
dataset_size: 55680473
- config_name: mr-ta
features:
- name: translation
dtype:
translation:
languages:
- mr
- ta
splits:
- name: train
num_bytes: 41585343
num_examples: 48535
download_size: 393352875
dataset_size: 41585343
- config_name: en-mr
features:
- name: translation
dtype:
translation:
languages:
- en
- mr
splits:
- name: train
num_bytes: 65042597
num_examples: 117199
download_size: 393352875
dataset_size: 65042597
config_names:
- bn-en
- bn-gu
- bn-hi
- bn-ml
- bn-mr
- bn-or
- bn-pa
- bn-ta
- bn-te
- bn-ur
- en-gu
- en-hi
- en-ml
- en-mr
- en-or
- en-pa
- en-ta
- en-te
- en-ur
- gu-hi
- gu-ml
- gu-mr
- gu-or
- gu-pa
- gu-ta
- gu-te
- gu-ur
- hi-ml
- hi-mr
- hi-or
- hi-pa
- hi-ta
- hi-te
- hi-ur
- ml-mr
- ml-or
- ml-pa
- ml-ta
- ml-te
- ml-ur
- mr-or
- mr-pa
- mr-ta
- mr-te
- mr-ur
- or-pa
- or-ta
- or-te
- or-ur
- pa-ta
- pa-te
- pa-ur
- ta-te
- ta-ur
- te-ur
---
# Dataset Card for CVIT PIB
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** http://preon.iiit.ac.in/~jerin/bhasha/
- **Paper:** https://arxiv.org/abs/2008.04860
- **Point of Contact:** [Mailing List](cvit-bhasha@googlegroups.com)
### Dataset Summary
This dataset is the large scale sentence aligned corpus in 11 Indian languages, viz. CVIT-PIB corpus that is the largest multilingual corpus available for Indian languages.
### Supported Tasks and Leaderboards
- Machine Translation
### Languages
Parallel data for following languages [en, bn, gu, hi, ml, mr, pa, or, ta, te, ur] are covered.
## Dataset Structure
### Data Instances
An example for the "gu-pa" language pair:
```
{
'translation': {
'gu': 'เชเชตเซ เชจเชฟเชฐเซเชฃเชฏ เชฒเซเชตเชพเชฏเซ เชนเชคเซ เชเซ เชเชเชคเชชเซเชฐเซเชตเชเชจเซ เชเชพเชฎเชเซเชฐเซ เชนเชพเชฅ เชงเชฐเชตเชพ, เชเชพเชฏเชฆเซเชธเชฐ เช
เชจเซ เชเซเชเชจเชฟเชเชฒ เชฎเซเชฒเซเชฏเชพเชเชเชจ เชเชฐเชตเชพ, เชตเซเชจเซเชเชฐ เชเซเชชเชฟเชเชฒ เชเชจเซเชตเซเชธเซเชเชฎเซเชจเซเช เชธเชฎเชฟเชคเชฟเชจเซ เชฌเซเช เช เชฏเซเชเชตเชพ เชตเชเซเชฐเซ เชเชเชเชเชซเชจเซ เชเชฐเชตเชพเชฎเชพเช เชเชตเซเชฒ เชชเซเชฐเชคเชฟเชฌเชฆเซเชงเชคเชพเชจเชพ 0.50 เชเชเชพ เชธเซเชงเซ เช
เชจเซ เชฌเชพเชเซเชจเซ เชฐเชเชฎ เชเชซเชเชซเชเชธเชจเซ เชชเซเชฐเซเชฃ เชเชฐเชตเชพเชฎเชพเช เชเชตเชถเซ.',
'pa': 'เจเจน เจตเฉ เจซเฉเจธเจฒเจพ เจเฉเจคเจพ เจเจฟเจ เจเจฟ เจเฉฑเจซเจเจเจเจ เจ
เจคเฉ เจฌเจเจพเจ เจฒเจ เจเฉเจคเฉเจเจ เจเจเจเจ เจตเจเจจเจฌเฉฑเจงเจคเจพเจตเจพเจ เจฆเฉ 0.50 % เจฆเฉ เจธเฉเจฎเจพ เจคเฉฑเจ เจเฉฑเจซเจเจเฉฑเจธ เจจเฉเฉฐ เจฎเจฟเจฒเจฟเจ เจเจพเจเจเจพ, เจเจธ เจจเจพเจฒ เจเฉฑเจฆเจฎ เจชเฉเฉฐเจเฉ เจจเจฟเจตเฉเจธเจผ เจเจฎเฉเจเฉ เจฆเฉ เจฌเฉเจ เจ เจฆเจพ เจเจฏเฉเจเจจ เจเจเจฟเจค เจธเจพเจตเจงเจพเจจเฉ, เจเจพเจจเฉเฉฐเจจเฉ เจ
เจคเฉ เจคเจเจจเฉเจเฉ เจฎเฉเฉฑเจฒเจพเจเจเจฃ เจฒเจ เจธเฉฐเจเจพเจฒเจจ เจเจฐเจ เจเจฆเจฟ เจฆเฉ เจชเฉเจฐเจคเฉ เจนเฉเจตเฉเจเฉเฅค'
}
}
```
### Data Fields
- `translation`: Translation field containing the parallel text for the pair of languages.
### Data Splits
The dataset is in a single "train" split.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[Creative Commons Attribution-ShareAlike 4.0 International](https://creativecommons.org/licenses/by-sa/4.0/) license.
### Citation Information
```
@inproceedings{siripragada-etal-2020-multilingual,
title = "A Multilingual Parallel Corpora Collection Effort for {I}ndian Languages",
author = "Siripragada, Shashank and
Philip, Jerin and
Namboodiri, Vinay P. and
Jawahar, C V",
booktitle = "Proceedings of the 12th Language Resources and Evaluation Conference",
month = may,
year = "2020",
address = "Marseille, France",
publisher = "European Language Resources Association",
url = "https://aclanthology.org/2020.lrec-1.462",
pages = "3743--3751",
language = "English",
ISBN = "979-10-95546-34-4",
}
@article{2020,
title={Revisiting Low Resource Status of Indian Languages in Machine Translation},
url={http://dx.doi.org/10.1145/3430984.3431026},
DOI={10.1145/3430984.3431026},
journal={8th ACM IKDD CODS and 26th COMAD},
publisher={ACM},
author={Philip, Jerin and Siripragada, Shashank and Namboodiri, Vinay P. and Jawahar, C. V.},
year={2020},
month={Dec}
}
```
### Contributions
Thanks to [@vasudevgupta7](https://github.com/vasudevgupta7) for adding this dataset,
and [@albertvillanova](https://github.com/albertvillanova) for updating its version. |
joey234/mmlu-jurisprudence-dev | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: dev
num_bytes: 2700
num_examples: 5
download_size: 6711
dataset_size: 2700
---
# Dataset Card for "mmlu-jurisprudence-dev"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
result-kand2-sdxl-wuerst-karlo/c06e4969 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 200
num_examples: 10
download_size: 1394
dataset_size: 200
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "c06e4969"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
izou3/MaskFormer_DB | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 125142150.192
num_examples: 7968
- name: test
num_bytes: 21986379.57
num_examples: 1405
download_size: 21304292
dataset_size: 147128529.762
---
# Dataset Card for "MaskFormer_DB"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vikp/evol_codealpaca_filtered_87k | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
- name: quality_prob
dtype: float64
- name: learning_prob
dtype: float64
splits:
- name: train
num_bytes: 194291512.64351812
num_examples: 87705
download_size: 107933444
dataset_size: 194291512.64351812
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "evol_codealpaca_filtered_86k"
Filtered version of `theblackcat102/evol-codealpaca-v1`, with manual filtering, and automatic filtering based on quality and learning value classifiers. |
bond005/sova_rudevices | ---
pretty_name: RuDevices
annotations_creators:
- expert-generated
language_creators:
- crowdsourced
language:
- ru
license:
- cc-by-4.0
multilinguality:
- monolingual
paperswithcode_id:
size_categories:
- 10K<n<100K
source_datasets:
- extended
task_categories:
- automatic-speech-recognition
- audio-classification
---
# Dataset Card for sova_rudevices
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [SOVA RuDevices](https://github.com/sovaai/sova-dataset)
- **Repository:** [SOVA Dataset](https://github.com/sovaai/sova-dataset)
- **Leaderboard:** [The ๐ค Speech Bench](https://huggingface.co/spaces/huggingface/hf-speech-bench)
- **Point of Contact:** [SOVA.ai](mailto:support@sova.ai)
### Dataset Summary
SOVA Dataset is free public STT/ASR dataset. It consists of several parts, one of them is SOVA RuDevices. This part is an acoustic corpus of approximately 100 hours of 16kHz Russian live speech with manual annotating, prepared by [SOVA.ai team](https://github.com/sovaai).
The authors do not divide the dataset into train, validation and test subsets, so I prepared this split myself. The training subset includes more than 82 hours, and the validation and test subsets each include approximately 6 hours.
### Supported Tasks and Leaderboards
- `automatic-speech-recognition`: The dataset can be used to train a model for Automatic Speech Recognition (ASR). The model is presented with an audio file and asked to transcribe the audio file to written text. The most common evaluation metric is the word error rate (WER). The task has an active Hugging Face leaderboard which can be found at https://huggingface.co/spaces/huggingface/hf-speech-bench. The leaderboard ranks models uploaded to the Hub based on their WER.
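As a rough illustration of the metric, WER is the word-level Levenshtein distance normalized by the reference length. A minimal pure-Python sketch (real evaluations typically use a library such as `jiwer` or `evaluate`):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution
    return dp[len(ref)][len(hyp)] / max(len(ref), 1)

print(wer("the cat sat", "the cat sat"))  # 0.0
print(wer("the cat sat", "the bat sat"))  # one substitution out of three words
```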
### Languages
The audio is in Russian.
## Dataset Structure
### Data Instances
A typical data point comprises the audio data, usually called `audio` and its transcription, called `transcription`. Any additional information about the speaker and the passage which contains the transcription is not provided.
```
{'audio': {'path': '/home/bond005/datasets/sova_rudevices/data/train/00003ec0-1257-42d1-b475-db1cd548092e.wav',
'array': array([ 0.00787354, 0.00735474, 0.00714111, ...,
-0.00018311, -0.00015259, -0.00018311]), dtype=float32),
'sampling_rate': 16000},
 'transcription': 'мне получше стало'}
```
### Data Fields
- audio: A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column: `dataset[0]["audio"]` the audio file is automatically decoded and resampled to `dataset.features["audio"].sampling_rate`. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the `"audio"` column, *i.e.* `dataset[0]["audio"]` should **always** be preferred over `dataset["audio"][0]`.
- transcription: the transcription of the audio file.
### Data Splits
This dataset consists of three splits: training, validation, and test. The split takes the internal structure of SOVA RuDevices into account (the validation split is based on the subdirectory `0`, and the test split on the subdirectory `1` of the original dataset), but audio recordings of the same speakers can appear in different splits at the same time (the opposite is not guaranteed).
| | Train | Validation | Test |
| ----- | ------ | ---------- | ----- |
| examples | 81607 | 5835 | 5799 |
| hours | 82.4h | 5.9h | 5.8h |
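A minimal sketch of how such a subdirectory-based split could be derived from file paths (the paths below are hypothetical and only illustrate the rule described above):

```python
from pathlib import PurePosixPath

def assign_split(wav_path: str) -> str:
    # Validation comes from subdirectory "0", test from subdirectory "1",
    # and everything else goes to train.
    subdir = PurePosixPath(wav_path).parent.name
    if subdir == "0":
        return "validation"
    if subdir == "1":
        return "test"
    return "train"

print(assign_split("data/0/00003ec0.wav"))   # validation
print(assign_split("data/1/00003ec0.wav"))   # test
print(assign_split("data/7f/00003ec0.wav"))  # train
```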
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
All recorded audio files were manually annotated.
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
The dataset consists of people who have donated their voice. You agree to not attempt to determine the identity of speakers in this dataset.
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
The dataset was initially created by Egor Zubarev, Timofey Moskalets, and SOVA.ai team.
### Licensing Information
[Creative Commons BY 4.0](https://creativecommons.org/licenses/by/4.0/)
### Citation Information
```
@misc{sova2021rudevices,
author = {Zubarev, Egor and Moskalets, Timofey and SOVA.ai},
title = {SOVA RuDevices Dataset: free public STT/ASR dataset with manually annotated live speech},
publisher = {GitHub},
journal = {GitHub repository},
year = {2021},
howpublished = {\url{https://github.com/sovaai/sova-dataset}},
}
```
### Contributions
Thanks to [@bond005](https://github.com/bond005) for adding this dataset. |
AleAle2423/Table_of_contents | ---
license: mit
language:
- en
tags:
- synthetic
pretty_name: Table of contents data set
size_categories:
- 10K<n<100K
--- |
KaiLv/UDR_CosmosQA | ---
dataset_info:
features:
- name: idx
dtype: int64
- name: question
dtype: string
- name: label
dtype: string
- name: choices
dtype: string
- name: len_question
dtype: int64
- name: max_len_choices
dtype: int64
splits:
- name: train
num_bytes: 11188271
num_examples: 18770
- name: test
num_bytes: 3979297
num_examples: 6030
- name: validation
num_bytes: 1722925
num_examples: 2603
- name: debug
num_bytes: 2985534
num_examples: 5000
download_size: 11095169
dataset_size: 19876027
---
# Dataset Card for "UDR_CosmosQA"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nlpyeditepe/tr-qnli | ---
annotations_creators:
- found
language_creators:
- machine-generated
language:
- tr-TR
license:
- mit
multilinguality:
- monolingual
pretty_name: QNLI for Turkish
size_categories:
- unknown
source_datasets:
- extended|glue
task_categories:
- text-classification
task_ids:
- natural-language-inference
--- |
anan-2024/twitter_dataset_1713009432 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 24140
num_examples: 56
download_size: 12841
dataset_size: 24140
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
chmourya/openassistant-mourya | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1797495
num_examples: 1500
download_size: 1030938
dataset_size: 1797495
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
salgadev/spirit-patterns | ---
license: lgpl
task_categories:
- image-classification
tags:
- chemistry
- spirits
- beverages
size_categories:
- n<1K
language:
- en
pretty_name: SpiritPatterns
--- |
dubeyam/PairRM-Preference-LIMA-Dataset | ---
license: mit
dataset_info:
features:
- name: data
struct:
- name: '##Question'
dtype: string
- name: '##Best Answer'
dtype: string
- name: '##Worst Answer'
dtype: string
splits:
- name: train
num_bytes: 44058
num_examples: 50
download_size: 33801
dataset_size: 44058
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
terru3/tokenized_tinystories384 | ---
dataset_info:
features:
- name: text
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 5998238715
num_examples: 2119719
- name: validation
num_bytes: 61703030
num_examples: 21990
download_size: 1771825341
dataset_size: 6059941745
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
ymoslem/Law-StackExchange | ---
license: cc-by-sa-4.0
task_categories:
- question-answering
- text-classification
- sentence-similarity
language:
- en
tags:
- legal
pretty_name: Law Stack Exchange Questions and Answers
size_categories:
- 10K<n<100K
---
All Stack Exchange legal questions and their answers from the Law site, up to 14 August 2023. The repository includes a notebook documenting the collection process using the official Stack Exchange API. |
Brizape/Variome_0404 | ---
dataset_info:
features:
- name: id
dtype: string
- name: texts
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence: int64
splits:
- name: train
num_bytes: 8210534
num_examples: 96
- name: test
num_bytes: 1447445
num_examples: 24
download_size: 972009
dataset_size: 9657979
---
# Dataset Card for "Variome_0404"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ekim15/bone_marrow_cell_dataset | ---
license: cc-by-4.0
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': ABE
'1': ART
'2': BAS
'3': BLA
'4': EBO
'5': EOS
'6': FGC
'7': HAC
'8': KSC
'9': LYI
'10': LYT
'11': MMZ
'12': MON
'13': MYB
'14': NGB
'15': NGS
'16': NIF
'17': OTH
'18': PEB
'19': PLM
'20': PMO
splits:
- name: train
num_bytes: 5531894343.482
num_examples: 137093
- name: validation
num_bytes: 688690986.192
num_examples: 17146
- name: test
num_bytes: 691641698.035
num_examples: 17135
download_size: 6935845206
dataset_size: 6912227027.709001
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
task_categories:
- image-classification
- unconditional-image-generation
tags:
- biology
- medical
size_categories:
- 100K<n<1M
---
## About This Dataset
A bone marrow biopsy is a procedure used to collect and examine bone marrow, the spongy tissue inside some of your larger bones.
This biopsy can show whether your bone marrow is healthy and making normal amounts of blood cells. Doctors use these procedures to diagnose and monitor blood and marrow diseases, cancers, and fevers of unknown origin.
The dataset contains a collection of over 170,000 de-identified, expert-annotated cells from the bone marrow smears of 945 patients stained using the May-Grünwald-Giemsa/Pappenheim stain. The diagnosis distribution in the cohort included a variety of hematological diseases reflective of the sample entry of a large laboratory specialized in leukemia diagnostics. Image acquisition was performed using a brightfield microscope with 40x magnification and oil immersion.
All samples were processed in the Munich Leukemia Laboratory (MLL), scanned using equipment developed at Fraunhofer IIS and post-processed using software developed at Helmholtz Munich.
## How to Use this dataset
- Create a multi-class classification model to predict cell abnormalities.
- Create a binary classification model to predict whether a cell is normal or not.
- Create an image generation model to augment the training dataset.
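For the binary task, one way to start is to collapse the 21 fine-grained labels into two classes. The normal/abnormal grouping below is purely illustrative and not a clinical judgment; consult the class definitions in the cited papers before fixing a grouping:

```python
# Class names in the order they appear in this dataset's label feature.
CLASS_NAMES = ["ABE", "ART", "BAS", "BLA", "EBO", "EOS", "FGC", "HAC", "KSC",
               "LYI", "LYT", "MMZ", "MON", "MYB", "NGB", "NGS", "NIF", "OTH",
               "PEB", "PLM", "PMO"]

# Hypothetical grouping for illustration only.
NORMAL = {"LYT", "MON", "NGS", "NGB", "EOS", "BAS"}

def to_binary(label_id: int) -> int:
    # 0 = normal, 1 = abnormal/other
    return 0 if CLASS_NAMES[label_id] in NORMAL else 1

print(to_binary(CLASS_NAMES.index("LYT")))  # 0
print(to_binary(CLASS_NAMES.index("BLA")))  # 1
```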
## Acknowledgements
Citation
Matek, C., Krappe, S., Münzenmayer, C., Haferlach, T., & Marr, C. (2021). An Expert-Annotated Dataset of Bone Marrow Cytology in Hematologic Malignancies [Data set]. The Cancer Imaging Archive. https://doi.org/10.7937/TCIA.AXH3-T579
Matek, C., Krappe, S., Münzenmayer, C., Haferlach, T., and Marr, C. (2021). Highly accurate differentiation of bone marrow cell morphologies using deep neural networks on a large image dataset. https://doi.org/10.1182/blood.2020010568
License
CC BY 4.0
https://www.kaggle.com/datasets/andrewmvd/bone-marrow-cell-classification/data
|
neohack22/temporary-dataset | ---
license: apache-2.0
---
|
GabrielVidal/dalle-3-palette | ---
dataset_info:
features:
- name: image
dtype: image
- name: palette
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 1778362423
num_examples: 1000
download_size: 1778260113
dataset_size: 1778362423
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: cc0-1.0
language:
- en
size_categories:
- n<1K
tags:
- image-text-dataset
- synthetic-dataset
--- |
huggingnft/boredapeyachtclub | ---
tags:
- huggingnft
- nft
- huggan
- gan
- image
- images
task:
- unconditional-image-generation
datasets:
- huggingnft/boredapeyachtclub
license: mit
---
# Dataset Card
## Disclaimer
All rights belong to their owners.
Models and datasets can be removed from the site at the request of the copyright holder.
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingnft](https://github.com/AlekseyKorshuk/huggingnft)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingnft](https://github.com/AlekseyKorshuk/huggingnft)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Dataset Summary
NFT images dataset for unconditional generation.
NFT collection available [here](https://opensea.io/collection/boredapeyachtclub).
Model is available [here](https://huggingface.co/huggingnft/boredapeyachtclub).
Check Space: [link](https://huggingface.co/spaces/AlekseyKorshuk/huggingnft).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingnft/boredapeyachtclub")
```
## Dataset Structure
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Data Fields
The data fields are the same among all splits.
- `image`: an `image` feature.
- `id`: an `int` feature.
- `token_metadata`: a `str` feature.
- `image_original_url`: a `str` feature.
### Data Splits
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingnft,
author={Aleksey Korshuk}
year=2022
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingnft)
|
renumics/f1_demo_dataset | ---
dataset_info:
features:
- name: Time
dtype: duration[ns]
- name: Driver
dtype: string
- name: DriverNumber
dtype: string
- name: LapTime
dtype: float64
- name: LapNumber
dtype: float64
- name: Stint
dtype: float64
- name: PitOutTime
dtype: duration[ns]
- name: PitInTime
dtype: duration[ns]
- name: Sector1Time
dtype: float64
- name: Sector2Time
dtype: float64
- name: Sector3Time
dtype: float64
- name: Sector1SessionTime
dtype: duration[ns]
- name: Sector2SessionTime
dtype: duration[ns]
- name: Sector3SessionTime
dtype: duration[ns]
- name: SpeedI1
dtype: float64
- name: SpeedI2
dtype: float64
- name: SpeedFL
dtype: float64
- name: SpeedST
dtype: float64
- name: IsPersonalBest
dtype: bool
- name: Compound
dtype: string
- name: TyreLife
dtype: float64
- name: FreshTyre
dtype: bool
- name: Team
dtype: string
- name: LapStartTime
dtype: duration[ns]
- name: LapStartDate
dtype: timestamp[ns]
- name: TrackStatus
dtype: string
- name: Position
dtype: float64
- name: Deleted
dtype: bool
- name: DeletedReason
dtype: string
- name: FastF1Generated
dtype: bool
- name: IsAccurate
dtype: bool
- name: speed
sequence:
sequence: float64
- name: throttle
sequence:
sequence: float64
- name: drs
sequence:
sequence: float64
- name: nGear
sequence:
sequence: float64
- name: brake
sequence:
sequence: float64
- name: x
sequence:
sequence: float64
- name: y
sequence:
sequence: float64
- name: z
sequence:
sequence: float64
- name: distance_driver
sequence:
sequence: float64
- name: speed_emb
sequence: float64
- name: brake_emb
sequence: float64
- name: throttle_emb
sequence: float64
- name: x_emb
dtype: float64
- name: y_emb
dtype: float64
- name: z_emb
dtype: float64
- name: gear_vis
dtype: string
- name: speed_vis
dtype: string
- name: portrait
dtype: string
- name: brake_emb_reduced
sequence: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 22426400
num_examples: 201
download_size: 15371945
dataset_size: 22426400
---
# Dataset Card for "f1_demo_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rcds/lower_court_insertion_swiss_judgment_prediction | ---
annotations_creators:
- expert-generated
language:
- de
- fr
- it
- en
language_creators:
- expert-generated
- found
license:
- cc-by-sa-4.0
multilinguality:
- multilingual
pretty_name: LowerCourtInsertionSwissJudgmentPrediction
size_categories:
- 1K<n<10K
source_datasets:
- extended|swiss_judgment_prediction
tags:
- explainability-judgment-prediction
task_categories:
- text-classification
- other
task_ids: []
---
# Dataset Card for "LowerCourtInsertionSwissJudgmentPrediction": An implementation of lower court insertion bias analysis for Swiss judgment prediction
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Summary](#dataset-summary)
- [Documents](#documents)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
  - [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Summary
This dataset contains an implementation of lower-court-insertion for the SwissJudgmentPrediction task.
Note that this dataset only provides a test set and should be used in combination with the [Swiss-Judgment-Prediction](https://huggingface.co/datasets/swiss_judgment_prediction) dataset.
### Documents
Lower-Court-Insertion-Swiss-Judgment-Prediction is a subset of the [Swiss-Judgment-Prediction](https://huggingface.co/datasets/swiss_judgment_prediction) dataset.
The Swiss-Judgment-Prediction dataset is a multilingual, diachronic dataset of 85K Swiss Federal Supreme Court (FSCS) cases annotated with the respective binarized judgment outcome (approval/dismissal), the publication year, the legal area and the canton of origin per case. Lower-Court-Insertion-Swiss-Judgment-Prediction extends this dataset by adding lower court insertion.
### Supported Tasks and Leaderboards
LowerCourtInsertionSwissJudgmentPrediction can be used to perform lower court insertion analysis in the legal judgment prediction task.
### Languages
Switzerland has four official languages with 3 languages (German, French and Italian) being represented in more than 1000 Swiss Federal Supreme court decisions. The decisions are written by the judges and clerks in the language of the proceedings.
## Dataset Structure
### Data Instances
#### Multilingual use of the dataset
When the dataset is used in a multilingual setting, select the 'all' flag:
```python
from datasets import load_dataset
dataset = load_dataset('rcds/lower_court_insertion_swiss_judgment_prediction', 'all')
```
#### Monolingual use of the dataset
When the dataset is used in a monolingual setting, select the ISO language code for one of the 3 supported languages. For example:
```python
from datasets import load_dataset
dataset = load_dataset('rcds/lower_court_insertion_swiss_judgment_prediction', 'de')
```
### Data Fields
The following data fields are provided for documents (test):
id: (**int**) a unique identifier of the for the document<br/>
year: (**int**) the publication year<br/>
label: (**str**) the judgment outcome: dismissal or approval<br/>
language: (**str**) one of (de, fr, it)<br/>
region: (**str**) the region of the lower court<br/>
canton: (**str**) the canton of the lower court<br/>
legal area: (**str**) the legal area of the case<br/>
explainability_label: (**str**) the explainability label assigned to the occluded text: (Lower court, Baseline)<br/>
text: (**str**) the facts of the case without the occluded text, except for cases with explainability label "Baseline" (which contain the entire facts)<br/>
lower_court: (**str**) the inserted lower_court (for Baseline there is no insertion)<br/>
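As a sketch of how these fields might be used — for example, to count how often each lower court was inserted — the rows below are illustrative stand-ins that mimic the documented schema, not real dataset entries:

```python
from collections import Counter

# Illustrative rows mimicking the documented schema (not real dataset entries).
rows = [
    {'language': 'de', 'explainability_label': 'Lower court', 'lower_court': 'Kantonsgericht'},
    {'language': 'de', 'explainability_label': 'Baseline', 'lower_court': None},
    {'language': 'fr', 'explainability_label': 'Lower court', 'lower_court': 'Tribunal cantonal'},
]

# Count insertion rows; "Baseline" rows carry no inserted lower court.
insertions = Counter(
    r['lower_court'] for r in rows if r['explainability_label'] == 'Lower court'
)
print(insertions.most_common())
```

The same aggregation applied to the real test split gives a per-lower-court view of the insertion experiment described above.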
### Data Splits (Including Swiss Judgment Prediction)
| Language | Subset | Number of Rows (Test) |
|-----|-----|------|
| German | de | __378__ |
| French | fr | __414__ |
| Italian | it | __335__ |
| All | all | __1127__ |
| Language | Subset | Number of Documents (Test) |
| ----------- | ----------- | ----------- |
| German | de | __38__ |
| French | fr | __36__ |
| Italian | it | __34__ |
| All | all | __108__ |
## Dataset Creation
### Curation Rationale
The dataset was curated by Niklaus et al. (2021) and Nina Baumgartner.
### Source Data
#### Initial Data Collection and Normalization
The original data are available at the Swiss Federal Supreme Court (https://www.bger.ch) in unprocessed formats (HTML). The documents were downloaded from the Entscheidsuche portal (https://entscheidsuche.ch) in HTML.
#### Who are the source language producers?
Switzerland has four official languages with 3 languages (German, French and Italian) being represented in more than 1000 Swiss Federal Supreme court decisions. The decisions are written by the judges and clerks in the language of the proceedings.
### Annotations
#### Annotation process
The decisions have been annotated with the binarized judgment outcome using parsers and regular expressions. In addition, a subset of the test set (27 cases in German, 24 in French and 23 in Italian, spanning the years 2017 to 2020) was annotated by legal experts with the lower court. These lower court annotations were then used to insert each lower court into each case once (instead of the original lower court), allowing an analysis of the changes in the model's performance for each inserted lower court and giving insight into a possible bias among them. The legal expert annotations were conducted from April 2020 to August 2020.
#### Who are the annotators?
Joel Niklaus and Adrian Jörg annotated the binarized judgment outcomes. Metadata is published by the Swiss Federal Supreme Court (https://www.bger.ch). The group of legal experts consists of Thomas Lüthi (lawyer), Lynn Grau (law student at master's level) and Angela Stefanelli (law student at master's level).
### Personal and Sensitive Information
The dataset contains publicly available court decisions from the Swiss Federal Supreme Court. Personal or sensitive information has been anonymized by the court before publication according to the following guidelines: https://www.bger.ch/home/juridiction/anonymisierungsregeln.html.
## Additional Information
### Dataset Curators
Niklaus et al. (2021) and Nina Baumgartner
### Licensing Information
We release the data under CC-BY-4.0 which complies with the court licensing (https://www.bger.ch/files/live/sites/bger/files/pdf/de/urteilsveroeffentlichung_d.pdf)
ยฉ Swiss Federal Supreme Court, 2000-2020
The copyright for the editorial content of this website and the consolidated texts, which is owned by the Swiss Federal Supreme Court, is licensed under the Creative Commons Attribution 4.0 International licence. This means that you can re-use the content provided you acknowledge the source and indicate any changes you have made.
Source: https://www.bger.ch/files/live/sites/bger/files/pdf/de/urteilsveroeffentlichung_d.pdf
### Citation Information
```
@misc{baumgartner_nina_occlusion_2019,
title = {From Occlusion to Transparancy โ An Occlusion-Based Explainability Approach for Legal Judgment Prediction in Switzerland},
shorttitle = {From Occlusion to Transparancy},
abstract = {Natural Language Processing ({NLP}) models have been used for more and more complex tasks such as Legal Judgment Prediction ({LJP}). A {LJP} model predicts the outcome of a legal case by utilizing its facts. This increasing deployment of Artificial Intelligence ({AI}) in high-stakes domains such as law and the involvement of sensitive data has increased the need for understanding such systems. We propose a multilingual occlusion-based explainability approach for {LJP} in Switzerland and conduct a study on the bias using Lower Court Insertion ({LCI}). We evaluate our results using different explainability metrics introduced in this thesis and by comparing them to high-quality Legal Expert Annotations using Inter Annotator Agreement. Our findings show that the model has a varying understanding of the semantic meaning and context of the facts section, and struggles to distinguish between legally relevant and irrelevant sentences. We also found that the insertion of a different lower court can have an effect on the prediction, but observed no distinct effects based on legal areas, cantons, or regions. However, we did identify a language disparity with Italian performing worse than the other languages due to representation inequality in the training data, which could lead to potential biases in the prediction in multilingual regions of Switzerland. Our results highlight the challenges and limitations of using {NLP} in the judicial field and the importance of addressing concerns about fairness, transparency, and potential bias in the development and use of {NLP} systems. The use of explainable artificial intelligence ({XAI}) techniques, such as occlusion and {LCI}, can help provide insight into the decision-making processes of {NLP} systems and identify areas for improvement. Finally, we identify areas for future research and development in this field in order to address the remaining limitations and challenges.},
author = {{Baumgartner, Nina}},
year = {2022},
langid = {english}
}
```
### Contributions
Thanks to [@ninabaumgartner](https://github.com/ninabaumgartner) for adding this dataset.
|
OpenGVLab/Caption-Evaluation-Data | ---
license: apache-2.0
---
This repository presents the evaluation data used for [ASM](https://github.com/Weiyun1025/all-seeing-official/tree/main/all-seeing). Please refer to [this document](https://github.com/OpenGVLab/all-seeing/tree/main/all-seeing#testing) for more details about the repository.
|
HumanCompatibleAI/random-seals-Walker2d-v1 | ---
dataset_info:
features:
- name: obs
sequence:
sequence: float64
- name: acts
sequence:
sequence: float32
- name: infos
sequence: string
- name: terminal
dtype: bool
- name: rews
sequence: float32
splits:
- name: train
num_bytes: 81303050
num_examples: 100
download_size: 41495120
dataset_size: 81303050
---
# Dataset Card for "random-seals-Walker2d-v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zolak/twitter_dataset_50_1713123422 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 244794
num_examples: 572
download_size: 121577
dataset_size: 244794
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Oztobuzz/gsm8k_0.1_42 | ---
license: mit
---
|
rkorez/medix-codegen-v2 | ---
dataset_info:
features:
- name: index
dtype: int64
- name: repo_id
dtype: string
- name: file_path
dtype: string
- name: content
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 206046946
num_examples: 30000
download_size: 61887714
dataset_size: 206046946
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
QNN/autotrain-data-pubmed | ---
task_categories:
- token-classification
---
# AutoTrain Dataset for project: pubmed
## Dataset Description
This dataset has been automatically processed by AutoTrain for project pubmed.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"tokens": [
"Pd",
"has",
"been",
"regarded",
"as",
"one",
"of",
"the",
"alternatives",
"to",
"Pt",
"as",
"a",
"promising",
"hydrogen",
"evolution",
"reaction",
"(HER)",
"catalyst.",
"Strategies",
"including",
"Pd-metal",
"alloys",
"(Pd-M)",
"and",
"Pd",
"hydrides",
"(PdH<sub><i>x</i></sub>)",
"have",
"been",
"proposed",
"to",
"boost",
"HER",
"performances.",
"However,",
"the",
"stability",
"issues,",
"e.g.,",
"the",
"dissolution",
"in",
"Pd-M",
"and",
"the",
"hydrogen",
"releasing",
"in",
"PdH<sub><i>x</i></sub>,",
"restrict",
"the",
"industrial",
"application",
"of",
"Pd-based",
"HER",
"catalysts.",
"We",
"here",
"design",
"and",
"synthesize",
"a",
"stable",
"Pd-Cu",
"hydride",
"(",
"PdCu<sub>0.2</sub>H<sub>0.43</sub>",
")",
"catalyst,",
"combining",
"the",
"advantages",
"of",
"both",
"Pd-M",
"and",
"PdH<sub><i>x</i></sub>",
"structures",
"and",
"improving",
"the",
"HER",
"durability",
"simultaneously.",
"The",
"hydrogen",
"intercalation",
"is",
"realized",
"under",
"atmospheric",
"pressure",
"(1.0",
"atm)",
"following",
"our",
"synthetic",
"approach",
"that",
"imparts",
"high",
"stability",
"to",
"the",
"Pd-Cu",
"hydride",
"structure.",
"The",
"obtained",
"PdCu<sub>0.2</sub>H<sub>0.43</sub>",
"catalyst",
"exhibits",
"a",
"small",
"overpotential",
"of",
"28",
"mV",
"at",
"10",
"mA/cm<sup>2</sup>",
",",
"a",
"low",
"Tafel",
"slope",
"of",
"23",
"mV/dec",
",",
"and",
"excellent",
"HER",
"durability",
"due",
"to",
"its",
"appropriate",
"hydrogen",
"adsorption",
"free",
"energy",
"and",
"alleviated",
"metal",
"dissolution",
"rate.",
"</p>",
"<p>"
],
"tags": [
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
0,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
0,
2,
2,
2,
2,
4,
2,
5,
5,
2,
5,
5,
2,
2,
2,
4,
2,
2,
5,
5,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2
]
},
{
"tokens": [
"A",
"critical",
"challenge",
"in",
"energy",
"research",
"is",
"the",
"development",
"of",
"earth",
"abundant",
"and",
"cost-effective",
"materials",
"that",
"catalyze",
"the",
"electrochemical",
"splitting",
"of",
"water",
"into",
"hydrogen",
"and",
"oxygen",
"at",
"high",
"rates",
"and",
"low",
"overpotentials.",
"Key",
"to",
"addressing",
"this",
"issue",
"lies",
"not",
"only",
"in",
"the",
"synthesis",
"of",
"new",
"materials,",
"but",
"also",
"in",
"the",
"elucidation",
"of",
"their",
"active",
"sites,",
"their",
"structure",
"under",
"operating",
"conditions",
"and",
"ultimately,",
"extraction",
"of",
"the",
"structure-function",
"relationships",
"used",
"to",
"spearhead",
"the",
"next",
"generation",
"of",
"catalyst",
"development.",
"In",
"this",
"work,",
"we",
"present",
"a",
"complete",
"cycle",
"of",
"synthesis,",
"operando",
"characterization,",
"and",
"redesign",
"of",
"an",
"amorphous",
"cobalt",
"phosphide",
"(",
"CoP",
"<sub><i>x</i></sub>",
")",
"bifunctional",
"catalyst.",
"The",
"research",
"was",
"driven",
"by",
"integrated",
"electrochemical",
"analysis,",
"Raman",
"spectroscopy",
"and",
"gravimetric",
"measurements",
"utilizing",
"a",
"novel",
"quartz",
"crystal",
"microbalance",
"spectroelectrochemical",
"cell",
"to",
"uncover",
"the",
"catalytically",
"active",
"species",
"of",
"amorphous",
"CoP",
"<sub><i>x</i></sub>",
"and",
"subsequently",
"modify",
"the",
"material",
"to",
"enhance",
"the",
"activity",
"of",
"the",
"elucidated",
"catalytic",
"phases.",
"Illustrating",
"the",
"power",
"of",
"our",
"approach,",
"the",
"second",
"generation",
"cobalt-iron",
"phosphide",
"(",
"CoFeP<sub>x</sub>",
")",
"catalyst,",
"developed",
"through",
"an",
"iteration",
"of",
"the",
"operando",
"measurement",
"directed",
"optimization",
"cycle,",
"is",
"superior",
"in",
"both",
"hydrogen",
"and",
"oxygen",
"evolution",
"reactivity",
"over",
"the",
"previous",
"material",
"and",
"is",
"capable",
"of",
"overall",
"water",
"electrolysis",
"at",
"a",
"current",
"density",
"of",
"10",
"mA",
"cm<sup>-2</sup>",
"with",
"1.5",
"V",
"applied",
"bias",
"in",
"1",
"M",
"KOH",
"electrolyte",
"solution.",
"</p>",
"<p>"
],
"tags": [
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
0,
0,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
0,
0,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
0,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
4,
4,
2,
5,
5,
5,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2
]
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"tokens": "Sequence(feature=Value(dtype='string', id=None), length=-1, id=None)",
"tags": "Sequence(feature=ClassLabel(names=['CATALYST', 'CO-CATALYST', 'O', 'Other', 'PROPERTY_NAME', 'PROPERTY_VALUE'], id=None), length=-1, id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follow:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 166 |
| valid | 44 |
|
ssahir/REPV | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: path
dtype: string
- name: file
dtype: string
- name: gender
dtype: string
- name: emotion
dtype: string
- name: speech
sequence: float32
splits:
- name: train
num_bytes: 380197186
num_examples: 1628
- name: test
num_bytes: 92682047
num_examples: 407
download_size: 0
dataset_size: 472879233
---
# Dataset Card for "REPV"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
felixsaaro/test | ---
dataset_info:
features:
- name: image
dtype: image
- name: name
dtype: string
- name: frameType
dtype: string
splits:
- name: train
num_bytes: 660338115.545
num_examples: 12405
download_size: 656146541
dataset_size: 660338115.545
license: apache-2.0
task_categories:
- text-to-image
language:
- en
pretty_name: Yugio Cards
size_categories:
- 1K<n<10K
---
# Dataset Card for "test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ISTNetworks/open-orca-arabic | ---
license: other
license_name: ist
license_link: LICENSE
---
|
AlekseyKorshuk/full_user_edit_responses-clean | ---
dataset_info:
features:
- name: message_id
dtype: string
- name: model_input
dtype: string
- name: response
dtype: string
- name: edited_response
dtype: string
- name: user_id
dtype: string
splits:
- name: train
num_bytes: 782450796.338342
num_examples: 364272
download_size: 355765300
dataset_size: 782450796.338342
---
# Dataset Card for "full_user_edit_responses-clean"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zolak/twitter_dataset_81_1713170079 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 431921
num_examples: 996
download_size: 201715
dataset_size: 431921
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ranimeree/NewDataSetMixed | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 395458124.688
num_examples: 2769
- name: validation
num_bytes: 61731350.0
num_examples: 352
- name: test
num_bytes: 12103649.0
num_examples: 101
download_size: 452227024
dataset_size: 469293123.688
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
genvision-ai/test_2024_02_20 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
- name: segments_info
list:
- name: area
dtype: float64
- name: bbox
sequence: float64
- name: category_id
dtype: int64
- name: id
dtype: int64
- name: image_id
dtype: int64
- name: iscrowd
dtype: int64
- name: segmentation
sequence:
sequence: float64
- name: image_name
dtype: string
splits:
- name: train
num_bytes: 2721373626.738
num_examples: 12039
download_size: 2737790368
dataset_size: 2721373626.738
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
lolechka/guanaco-llama2-ru-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 166211.8171846435
num_examples: 788
- name: test
num_bytes: 9932.181467181466
num_examples: 46
download_size: 1056715
dataset_size: 176143.99865182498
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
nimrita/BhagwadGitaChapter2Embeddings | ---
license: afl-3.0
---
This file contains embeddings of the English text of Bhagwad Gita Chapter 2 from this site (https://vedabase.io/en/library/bg/2/),
generated using the sentence-transformers/all-MiniLM-L6-v2 sentence-transformer model from Hugging Face.
|
retkowski/ytseg | ---
license: cc-by-nc-sa-4.0
language:
- en
tags:
- text segmentation
- smart chaptering
- segmentation
- youtube
- asr
pretty_name: YTSeg
size_categories:
- 10K<n<100K
task_categories:
- token-classification
- automatic-speech-recognition
---
# From Text Segmentation to Smart Chaptering: A Novel Benchmark for Structuring Video Transcriptions
We present <span style="font-variant:small-caps; font-weight:700;">YTSeg</span>, a topically and structurally diverse benchmark for the text segmentation task based on YouTube transcriptions. The dataset comprises 19,299 videos from 393 channels, amounting to 6,533 content hours. The topics are wide-ranging, covering domains such as science, lifestyle, politics, health, economy, and technology. The videos are from various types of content formats, such as podcasts, lectures, news, corporate events \& promotional content, and, more broadly, videos from individual content creators. We refer to the **paper** ([acl](https://aclanthology.org/2024.eacl-long.25/) | [arXiv](https://arxiv.org/abs/2402.17633)) for further information. We provide both text and audio data as well as a download script for the video data.
## Data Overview
### <span style="font-variant:small-caps;">YTSeg</span>
Each video is represented as a JSON object with the following fields:
| Field | Description |
|--------------|------------------------------------------------|
| `text` | A flat list of sentences. |
| `targets` | The target segmentation as a string of binary values (e.g., `000100000010`). |
| `channel_id` | The YouTube channel ID which this video belongs to. |
| `video_id` | The YouTube video ID. |
| `audio_path` | Path to the .mp3 file of the video. |
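A minimal sketch of how the `targets` string can be turned back into sentence groups, under the assumption that `1` marks the last sentence of a segment (verify this convention against the paper):

```python
def targets_to_segments(sentences, targets):
    """Group a flat sentence list into segments, assuming '1' marks
    the last sentence of a segment (an assumption; check the paper)."""
    segments, current = [], []
    for sentence, flag in zip(sentences, targets):
        current.append(sentence)
        if flag == '1':
            segments.append(current)
            current = []
    if current:  # trailing sentences after the last boundary
        segments.append(current)
    return segments

sentences = ['a', 'b', 'c', 'd', 'e']
print(targets_to_segments(sentences, '01001'))
# [['a', 'b'], ['c', 'd', 'e']]
```

The inverse direction (segments back to a binary string) is a useful sanity check when preparing model targets.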
| Partition | # Examples |
|------------|--------------|
| Training | 16,404 (85%) |
| Validation | 1,447 (7.5%) |
| Testing | 1,448 (7.5%) |
| Total | 19,229 |
### <span style="font-variant:small-caps;">YTSeg[Titles]</span>
Each chapter of a video is represented as a JSON object with the following fields:
| Field | Description |
|--------------|------------------------------------------------|
| `input` | The complete chapter/section text. |
| `input_with_chapters` | The complete chapter/section text with previous section titles prepended. |
| `target` | The target chapter title. |
| `channel_id` | The YouTube channel ID which this chapter's video belongs to. |
| `video_id` | The YouTube video ID which this chapter belongs to. |
| `chapter_idx` | The index and placement of the chapter in the video (e.g., the first chapter has index `0`). |
| Partition | # Examples |
|------------|--------------|
| Training | 146,907 (84.8%)|
| Validation | 13,206 (7.6%) |
| Testing | 13,082 (7.6%) |
| Total | 173,195 |
### Audio Data
We provide audio files for all examples in the dataset, preprocessed into the .mp3 format with a standardized sample rate of 16,000 Hz and a single channel (mono). These files are organized within the directory structure as follows: `data/audio/<channel_id>/<video_id>.mp3`.
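A small helper sketch for locating an example's audio file from its `channel_id` and `video_id` fields, assuming the directory layout described above:

```python
import posixpath

def audio_path(root, channel_id, video_id):
    # Mirrors the documented layout: data/audio/<channel_id>/<video_id>.mp3
    # channel_id and video_id come from the per-video JSON fields above.
    return posixpath.join(root, 'audio', channel_id, video_id + '.mp3')

print(audio_path('data', 'UC_example', 'dQw4w9WgXcQ'))
# data/audio/UC_example/dQw4w9WgXcQ.mp3
```

(`UC_example`/`dQw4w9WgXcQ` are placeholder IDs, not entries from the dataset.)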
### Video Data
A download script for the video and audio data is provided.
```py
python download_videos.py
```
In the script, you can further specify a target folder (default is `./video`) and target formats in a priority list.
## Loading Text Data
This repository comes with a simple, exemplary script to read in the text data with `pandas`.
```py
from load_data import get_partition
test_data = get_partition('test')
```
Equivalently, to read in <span style="font-variant:small-caps;">YTSeg[Titles]</span>:
```py
from load_data import get_title_partition
test_data = get_title_partition('test')
```
## Citing
We kindly request you to cite our corresponding EACL 2024 paper if you use our dataset.
```
@inproceedings{retkowski-waibel-2024-text,
title = "From Text Segmentation to Smart Chaptering: A Novel Benchmark for Structuring Video Transcriptions",
author = "Retkowski, Fabian and Waibel, Alexander",
editor = "Graham, Yvette and Purver, Matthew",
booktitle = "Proceedings of the 18th Conference of the European Chapter of the Association for Computational Linguistics (Volume 1: Long Papers)",
month = mar,
year = "2024",
address = "St. Julian{'}s, Malta",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2024.eacl-long.25",
pages = "406--419",
abstract = "Text segmentation is a fundamental task in natural language processing, where documents are split into contiguous sections. However, prior research in this area has been constrained by limited datasets, which are either small in scale, synthesized, or only contain well-structured documents. In this paper, we address these limitations by introducing a novel benchmark YTSeg focusing on spoken content that is inherently more unstructured and both topically and structurally diverse. As part of this work, we introduce an efficient hierarchical segmentation model MiniSeg, that outperforms state-of-the-art baselines. Lastly, we expand the notion of text segmentation to a more practical {``}smart chaptering{''} task that involves the segmentation of unstructured content, the generation of meaningful segment titles, and a potential real-time application of the models.",
}
```
## License
The dataset is available under the **Creative Commons Attribution-NonCommercial-ShareAlike (CC BY-NC-SA) 4.0** license. We note that we do not own the copyright of the videos and as such opted to release the dataset with a non-commercial license, with the intended use being research and education. |
Multimodal-Fatima/VQAv2_test_no_image_split_3 | ---
dataset_info:
features:
- name: question_type
dtype: string
- name: multiple_choice_answer
dtype: string
- name: answers
sequence: string
- name: answers_original
list:
- name: answer
dtype: string
- name: answer_confidence
dtype: string
- name: answer_id
dtype: int64
- name: id_image
dtype: int64
- name: answer_type
dtype: string
- name: question_id
dtype: int64
- name: question
dtype: string
- name: id
dtype: int64
- name: clip_tags_ViT_L_14
sequence: string
- name: blip_caption
dtype: string
- name: LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14
sequence: string
- name: DETA_detections_deta_swin_large_o365_coco_classes
list:
- name: attribute
dtype: string
- name: box
sequence: float32
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float32
- name: size
dtype: string
- name: tag
dtype: string
- name: Attributes_ViT_L_14_descriptors_text_davinci_003_full
sequence: string
- name: clip_tags_ViT_L_14_wo_openai
sequence: string
- name: clip_tags_ViT_L_14_with_openai
sequence: string
- name: clip_tags_LAION_ViT_H_14_2B_wo_openai
sequence: string
- name: clip_tags_LAION_ViT_H_14_2B_with_openai
sequence: string
- name: clip_tags_LAION_ViT_bigG_14_2B_wo_openai
sequence: string
- name: clip_tags_LAION_ViT_bigG_14_2B_with_openai
sequence: string
- name: Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full
sequence: string
- name: Attributes_LAION_ViT_bigG_14_2B_descriptors_text_davinci_003_full
sequence: string
- name: DETA_detections_deta_swin_large_o365_coco_classes_caption_module_random
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: captions_module
sequence: string
- name: captions_module_filter
sequence: string
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
splits:
- name: test
num_bytes: 2150238682
num_examples: 44779
download_size: 540758200
dataset_size: 2150238682
---
# Dataset Card for "VQAv2_test_no_image_split_3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-glue-mnli-026a6e-14686017 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- glue
eval_info:
task: natural_language_inference
model: Jiva/xlm-roberta-large-it-mnli
metrics: []
dataset_name: glue
dataset_config: mnli
dataset_split: validation_matched
col_mapping:
text1: premise
text2: hypothesis
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Natural Language Inference
* Model: Jiva/xlm-roberta-large-it-mnli
* Dataset: glue
* Config: mnli
* Split: validation_matched
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
Brendan/BabyLMDemo | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2220547
num_examples: 80000
download_size: 0
dataset_size: 2220547
---
# Dataset Card for "BabyLMDemo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Cleudemir/basedevoz2 | ---
license: openrail
---
|
yazan-bawab/ps-llm | ---
license: mit
---
|
tmnam20/ViPubMed | ---
license: cc
language:
- vi
- en
task_categories:
- text-generation
- fill-mask
task_ids:
- language-modeling
- masked-language-modeling
paperswithcode_id: pubmed
dataset_info:
features:
- name: en
dtype: string
- name: vi
dtype: string
splits:
- name: pubmed22
num_bytes: 44360028980
num_examples: 20087006
download_size: 23041004247
dataset_size: 44360028980
---
# ALERT: This dataset repo is duplicated from [VietAI/vi_pubmed](https://huggingface.co/datasets/VietAI/vi_pubmed)
The reason for this duplicated repo is to avoid loss or corruption of the original repo while I am doing some stuff ^^.
# Dataset Summary
20M Vietnamese PubMed biomedical abstracts translated by the [state-of-the-art English-Vietnamese Translation project](https://arxiv.org/abs/2210.05610). The data has been used as an unlabeled dataset for [pretraining a Vietnamese Biomedical-domain Transformer model](https://arxiv.org/abs/2210.05598).

image source: [Enriching Biomedical Knowledge for Vietnamese Low-resource Language Through Large-Scale Translation](https://arxiv.org/abs/2210.05598)
# Language
- English: Original biomedical abstracts from [Pubmed](https://www.nlm.nih.gov/databases/download/pubmed_medline_faq.html)
- Vietnamese: Synthetic abstract translated by a [state-of-the-art English-Vietnamese Translation project](https://arxiv.org/abs/2210.05610)
# Dataset Structure
- The English sequences are
- The Vietnamese sequences are
# Source Data - Initial Data Collection and Normalization
https://www.nlm.nih.gov/databases/download/pubmed_medline_faq.html
# Licensing Information
[Courtesy of the U.S. National Library of Medicine.](https://www.nlm.nih.gov/databases/download/terms_and_conditions.html)
# Citation
```
@misc{mtet,
doi = {10.48550/ARXIV.2210.05610},
url = {https://arxiv.org/abs/2210.05610},
author = {Ngo, Chinh and Trinh, Trieu H. and Phan, Long and Tran, Hieu and Dang, Tai and Nguyen, Hieu and Nguyen, Minh and Luong, Minh-Thang},
keywords = {Computation and Language (cs.CL), Artificial Intelligence (cs.AI), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {MTet: Multi-domain Translation for English and Vietnamese},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
```
```
@misc{vipubmed,
doi = {10.48550/ARXIV.2210.05598},
url = {https://arxiv.org/abs/2210.05598},
author = {Phan, Long and Dang, Tai and Tran, Hieu and Phan, Vy and Chau, Lam D. and Trinh, Trieu H.},
keywords = {Computation and Language (cs.CL), Artificial Intelligence (cs.AI), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Enriching Biomedical Knowledge for Vietnamese Low-resource Language Through Large-Scale Translation},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
``` |
cbt | ---
annotations_creators:
- machine-generated
language_creators:
- found
language:
- en
license:
- gfdl
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
- n<1K
source_datasets:
- original
task_categories:
- other
- question-answering
task_ids:
- multiple-choice-qa
paperswithcode_id: cbt
pretty_name: Children's Book Test (CBT)
config_names:
- CN
- NE
- P
- V
- raw
dataset_info:
- config_name: CN
features:
- name: sentences
sequence: string
- name: question
dtype: string
- name: answer
dtype: string
- name: options
sequence: string
splits:
- name: train
num_bytes: 301730151
num_examples: 120769
- name: test
num_bytes: 6138376
num_examples: 2500
- name: validation
num_bytes: 4737257
num_examples: 2000
download_size: 31615166
dataset_size: 312605784
- config_name: NE
features:
- name: sentences
sequence: string
- name: question
dtype: string
- name: answer
dtype: string
- name: options
sequence: string
splits:
- name: train
num_bytes: 253551931
num_examples: 108719
- name: test
num_bytes: 5707734
num_examples: 2500
- name: validation
num_bytes: 4424316
num_examples: 2000
download_size: 29693075
dataset_size: 263683981
- config_name: P
features:
- name: sentences
sequence: string
- name: question
dtype: string
- name: answer
dtype: string
- name: options
sequence: string
splits:
- name: train
num_bytes: 852852601
num_examples: 334030
- name: test
num_bytes: 6078048
num_examples: 2500
- name: validation
num_bytes: 4776981
num_examples: 2000
download_size: 43825356
dataset_size: 863707630
- config_name: V
features:
- name: sentences
sequence: string
- name: question
dtype: string
- name: answer
dtype: string
- name: options
sequence: string
splits:
- name: train
num_bytes: 252177649
num_examples: 105825
- name: test
num_bytes: 5806625
num_examples: 2500
- name: validation
num_bytes: 4556425
num_examples: 2000
download_size: 29992082
dataset_size: 262540699
- config_name: raw
features:
- name: title
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 25741580
num_examples: 98
- name: test
num_bytes: 1528704
num_examples: 5
- name: validation
num_bytes: 1182657
num_examples: 5
download_size: 16350790
dataset_size: 28452941
configs:
- config_name: CN
data_files:
- split: train
path: CN/train-*
- split: test
path: CN/test-*
- split: validation
path: CN/validation-*
- config_name: NE
data_files:
- split: train
path: NE/train-*
- split: test
path: NE/test-*
- split: validation
path: NE/validation-*
- config_name: P
data_files:
- split: train
path: P/train-*
- split: test
path: P/test-*
- split: validation
path: P/validation-*
- config_name: V
data_files:
- split: train
path: V/train-*
- split: test
path: V/test-*
- split: validation
path: V/validation-*
- config_name: raw
data_files:
- split: train
path: raw/train-*
- split: test
path: raw/test-*
- split: validation
path: raw/validation-*
---
# Dataset Card for CBT
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [The bAbI project](https://research.fb.com/downloads/babi/)
- **Repository:**
- **Paper:** [arXiv Paper](https://arxiv.org/pdf/1511.02301.pdf)
- **Leaderboard:**
- **Point of Contact:** [Felix Hill](mailto:felix.hill@cl.cam.ac.uk) or [Antoine Bordes](mailto:abordes@fb.com).
### Dataset Summary
The Childrenโs Book Test (CBT) is designed to measure directly how well language models can exploit wider linguistic context. The CBT is built from books that are freely available.
This dataset contains four different configurations:
- `V`: where the answers to the questions are verbs.
- `P`: where the answers to the questions are prepositions.
- `NE`: where the answers to the questions are named entities.
- `CN`: where the answers to the questions are common nouns.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
The data is in English, as written by children's book authors such as Lucy Maud Montgomery, Charles Dickens, and Andrew Lang.
## Dataset Structure
### Data Instances
An instance from the `V` config:
```
{'answer': 'said', 'options': ['christening', 'existed', 'hear', 'knows', 'read', 'remarked', 'said', 'sitting', 'talking', 'wearing'], 'question': "`` They are very kind old ladies in their way , '' XXXXX the king ; `` and were nice to me when I was a boy . ''", 'sentences': ['This vexed the king even more than the queen , who was very clever and learned , and who had hated dolls when she was a child .', 'However , she , too in spite of all the books she read and all the pictures she painted , would have been glad enough to be the mother of a little prince .', 'The king was anxious to consult the fairies , but the queen would not hear of such a thing .', 'She did not believe in fairies : she said that they had never existed ; and that she maintained , though The History of the Royal Family was full of chapters about nothing else .', 'Well , at long and at last they had a little boy , who was generally regarded as the finest baby that had ever been seen .', 'Even her majesty herself remarked that , though she could never believe all the courtiers told her , yet he certainly was a fine child -- a very fine child .', 'Now , the time drew near for the christening party , and the king and queen were sitting at breakfast in their summer parlour talking over it .', 'It was a splendid room , hung with portraits of the royal ancestors .', 'There was Cinderella , the grandmother of the reigning monarch , with her little foot in her glass slipper thrust out before her .', 'There was the Marquis de Carabas , who , as everyone knows , was raised to the throne as prince consort after his marriage with the daughter of the king of the period .', 'On the arm of the throne was seated his celebrated cat , wearing boots .', 'There , too , was a portrait of a beautiful lady , sound asleep : this was Madame La Belle au Bois-dormant , also an ancestress of the royal family .', 'Many other pictures of celebrated persons were hanging on the walls .', "`` You have asked all the right people , my dear ? ''", 'said the king .', "`` Everyone who should be asked , '' answered the queen .", "`` People are so touchy on these occasions , '' said his majesty .", "`` You have not forgotten any of our aunts ? ''", "`` No ; the old cats ! ''", "replied the queen ; for the king 's aunts were old-fashioned , and did not approve of her , and she knew it ."]}
```
### Data Fields
For the `raw` config, the data fields are:
- `title`: a `string` feature containing the title of the book present in the dataset.
- `content`: a `string` feature containing the content of the book present in the dataset.
For all other configs, the data fields are:
- `sentences`: a `list` of `string` features containing 20 sentences from a book.
- `question`: a `string` feature containing a question with the blank marked as `XXXXX`, which is to be filled with one of the options.
- `answer`: a `string` feature containing the answer.
- `options`: a `list` of `string` features containing the options for the question.
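As an illustrative sanity check (not part of the official loader), the field layout above can be verified on a single example; the dict below is a hypothetical instance shaped like the `V` example shown earlier:

```python
# Hypothetical CBT-style example dict, shaped like the `V` instance above.
example = {
    "sentences": [f"Context sentence {i} ." for i in range(20)],  # 20 context sentences
    "question": "`` They are very kind old ladies , '' XXXXX the king .",
    "answer": "said",
    "options": ["said", "remarked", "read", "hear", "knows",
                "talking", "sitting", "wearing", "existed", "christening"],
}

# Each question has exactly 20 context sentences and 10 candidate answers,
# the placeholder XXXXX marks the removed word, and the answer is one of the options.
assert len(example["sentences"]) == 20
assert len(example["options"]) == 10
assert "XXXXX" in example["question"]
assert example["answer"] in example["options"]
```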
### Data Splits
The splits and corresponding sizes are:
| |train |test |validation|
|:--|------:|----:|---------:|
|raw|98 |5 |5 |
|V |105825 |2500 |2000 |
|P |334030 |2500 |2000 |
|CN |120769 |2500 |2000 |
|NE |108719 |2500 |2000 |
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
Children's Book Authors
### Annotations
#### Annotation process
From the [homepage](https://research.fb.com/downloads/babi/):
>After allocating books to either training, validation or test sets, we formed example โquestionsโ from chapters in the book by enumerating 21 consecutive sentences. In each question, the first 20 sentences form the context, and a word is removed from the 21st sentence, which becomes the query. Models must identify the answer word among a selection of 10 candidate answers appearing in the context sentences and the query. For finer-grained analyses, we evaluated four classes of question by removing distinct types of word: Named Entities, (Common) Nouns, Verbs and Prepositions.
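The construction procedure quoted above can be sketched in a few lines; this is an illustrative reconstruction under the stated 20+1 scheme, not the official preprocessing script:

```python
import re

def make_cbt_question(sentences, answer_word):
    """Build a CBT-style cloze example from 21 consecutive sentences:
    the first 20 form the context, and `answer_word` is blanked out of
    the 21st to form the query. (Illustrative sketch only.)"""
    assert len(sentences) == 21
    context, query_sentence = sentences[:20], sentences[20]
    # Replace exactly one occurrence of the answer word with the placeholder.
    query = re.sub(r"\b%s\b" % re.escape(answer_word), "XXXXX", query_sentence, count=1)
    return {"sentences": context, "question": query, "answer": answer_word}

sents = ["sentence %d ." % i for i in range(20)] + ["said the king ."]
ex = make_cbt_question(sents, "said")
# ex["question"] is now "XXXXX the king ."
```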
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
```
GNU Free Documentation License v1.3
```
### Citation Information
```
@misc{hill2016goldilocks,
title={The Goldilocks Principle: Reading Children's Books with Explicit Memory Representations},
author={Felix Hill and Antoine Bordes and Sumit Chopra and Jason Weston},
year={2016},
eprint={1511.02301},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
### Contributions
Thanks to [@gchhablani](https://github.com/gchhablani) for adding this dataset. |
marhensa/comfyui-workflow | ---
language:
- en
tags:
- comfyui
- template
- workflow
---
# HOW TO USE:
Drag and Drop Template.JSON / Template.PNG into [ComfyUI in browser](https://github.com/comfyanonymous/ComfyUI) to load the template! (Yes even output PNG file works as workflow template).
# PREREQUISITE INSTALLATION NOTES:
It requires a few custom nodes and model files; installation is easy:
1. ["ComfyUI Manager"](https://github.com/ltdrdata/ComfyUI-Manager#installation) (install it first).
2. Open ComfyUI WebUI > Manager > Install Custom Nodes, search for "ComfyUI-Impact-Pack", "SDXL Prompt Styler", "Recommended SDXL Resolution Calculator", and "Masquerade Nodes", install them all, then close the ComfyUI console command prompt.
3. "Recommended SD 1.5 Resolution Calculator": open a command prompt in \ComfyUI\custom_nodes\ and run `git clone https://github.com/marhensa/sd15-recommended-res-calc.git`
4. "ComfyUI ControlNet Aux" custom node can be installed with [this instruction](https://github.com/Fannovel16/comfyui_controlnet_aux#installation), and the models can be downloaded [from here](https://huggingface.co/lllyasviel/sd_control_collection/tree/main), put into \ComfyUI\models\controlnet\
5. LCM LoRA: download the LCM LoRA weights from here for [LCM-SDXL](https://huggingface.co/latent-consistency/lcm-lora-sdxl/resolve/main/pytorch_lora_weights.safetensors?download=true), for [LCM-SDXL-SSD1B](https://huggingface.co/latent-consistency/lcm-lora-ssd-1b/resolve/main/pytorch_lora_weights.safetensors?download=true), and here for [LCM-SD1.5](https://huggingface.co/latent-consistency/lcm-lora-sdv1-5/resolve/main/pytorch_lora_weights.safetensors?download=true). Put LCM-SDXL into \ComfyUI\models\loras\LCM\SDXL\, LCM-SDXL-SSD1B into \ComfyUI\models\loras\LCM\SDXL-SSD1B\, and LCM-SD1.5 into \ComfyUI\models\loras\LCM\SD15\ .
6. LCM-Turbo: download from [here](https://civitai.com/api/download/models/246747?type=Model&format=SafeTensor) for SDXL-based models, then put it into \ComfyUI\models\loras\SDXL\LCMTurboMix_Euler_A_fix.safetensors
7. Additional LoRAs: download extra LoRAs from CivitAI, or select "None" to not use any.
8. VAE selector: needs a VAE file. Download the SDXL BF16 VAE from [here](https://huggingface.co/madebyollin/sdxl-vae-fp16-fix/resolve/main/sdxl_vae.safetensors) and the VAE file for SD 1.5 from [here](https://huggingface.co/stabilityai/sd-vae-ft-mse-original/resolve/main/vae-ft-mse-840000-ema-pruned.safetensors?download=true). Put them into \ComfyUI\models\vae\SDXL\ and \ComfyUI\models\vae\SD15\ respectively.
9. Upscale model: needs to be downloaded into \ComfyUI\models\upscale_models\. The recommended one is 4x-UltraSharp, download it [from here](https://mega.nz/folder/qZRBmaIY#nIG8KyWFcGNTuMX_XNbJ_g). The full list of upscale models is [here](https://upscale.wiki/w/index.php?title=Model_Database&oldid=1571).
10. Checkpoints: SDXL-SSD1B can be downloaded from [here](https://huggingface.co/segmind/SSD-1B/resolve/main/SSD-1B.safetensors?download=true). My recommended checkpoints for SDXL are [AlbedoBase-XL](https://civitai.com/models/140737), [Jib-Mix-Realistic-XL](https://civitai.com/models/194768), and [Crystal-Clear-XL](https://civitai.com/models/122822); for SD1.5, [Haveall](https://civitai.com/models/213692). Download the Safetensors files and put them into \ComfyUI\models\checkpoints\SDXL\ and \ComfyUI\models\checkpoints\SD15\
11. Note: when loading a PNG workflow from here, first click Refresh in the ComfyUI menu to refresh the models available on your PC, then choose them (Checkpoints, VAE, LoRA, Upscale Model, ControlNet models, etc.) accordingly.
.
> The custom node "[Recommended Resolution Calculator](https://github.com/marhensa/sdxl-recommended-res-calc)" makes targeting a final resolution much easier, because it calculates the recommended SD initial size and its upscale (or reverse upscale) value.
> You only need to input your FINAL target resolution with this custom node.
.
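The idea behind such a calculator can be sketched as follows; note that the bucket list below is an assumed subset of SDXL-friendly sizes for illustration, not the node's actual table or logic:

```python
# Illustrative sketch of a "recommended resolution calculator".
# SDXL_BUCKETS is an assumed sample of SDXL-friendly initial sizes,
# NOT the custom node's real lookup table.
SDXL_BUCKETS = [(1024, 1024), (1152, 896), (896, 1152), (1216, 832), (832, 1216)]

def recommend(target_w, target_h):
    """Pick the bucket whose aspect ratio is closest to the target,
    and return it with the upscale factor needed to reach the target width."""
    target_ratio = target_w / target_h
    init_w, init_h = min(SDXL_BUCKETS, key=lambda wh: abs(wh[0] / wh[1] - target_ratio))
    upscale = target_w / init_w
    return init_w, init_h, upscale

# e.g. a 2048x2048 final image starts from a 1024x1024 initial render with a 2.0x upscale
w, h, f = recommend(2048, 2048)
```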
# WORKFLOW SELECTION:
(drag and drop that PNG image file into ComfyUI interface, it will open up as workflow template)
## RECOMMENDED! FAST!! TIDY - Single SDXL Checkpoint Workflow (LCM-Turbo, PromptStyler, Upscale Model Switch, ControlNet, FaceDetailer) :

(ControlNet image reference example: [halo.jpg](https://huggingface.co/datasets/marhensa/comfyui-workflow/resolve/main/halo.jpg))
.
## TIDY - Single SD 1.5 Checkpoint Workflow (LCM, PromptStyler, Upscale Model Switch, ControlNet, FaceDetailer) :

(ControlNet image reference example: [halo.jpg](https://huggingface.co/datasets/marhensa/comfyui-workflow/resolve/main/halo.jpg))
.
## Breakdown - SDXL Workflow (LCM-Turbo, PromptStyler, Upscale Model Switch) :

.
## Breakdown - SDXL-SSD1B Workflow (LCM, PromptStyler, Upscale Model Switch) :

.
## Breakdown - SD1.5 Workflow (LCM, PromptStyler, Upscale Model Switch) :

.
## Simple without install anything - SDXL / SD15 LCM Workflow :

|
hustzx/sd_test | ---
license: apache-2.0
---
|
detakarang/know_sql_alpaca | ---
license: openrail
---
|
youngdicey/rico-raw | ---
license: openrail
---
|
gulyaabdullaeva/DS_test | ---
license: unknown
---
|