datasetId | card |
|---|---|
Red-8/NER_Gujarati_data | ---
task_categories:
- token-classification
language:
- gu
tags:
- Months
- Days
- Seasons
- Time
- Date
- Year
- Ordinals
- Number
- Percentage
- Quantity
pretty_name: Gujarati_Data
size_categories:
- n<1K
--- |
MITCriticalData/SAT4_dataset_10_best_cities_augmented_v1 | ---
license: mit
---
|
fredguth/aisegmentcn-matting-human | ---
annotations_creators:
- Beijing Wanxing Convergence Technology Co
license:
- mit
pretty_name: aisegmentcn-matting-human
size_categories:
- 10K<n<100K
tags:
- binary
- aisegment.cn
task_categories:
- image-segmentation
task_ids:
- semantic-segmentation
---
# Dataset Card for AISegment.cn - Matting Human datasets
## Table of Contents
- [Dataset Card for AISegment.cn - Matting Human datasets](#dataset-card-for-aisegmentcn---matting-human-datasets)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Structure](#dataset-structure)
- [Licensing Information](#licensing-information)
## Dataset Description
Quoting the [dataset's github](https://github.com/aisegmentcn/matting_human_datasets) (translated by Apple Translator):
> This dataset is currently the largest portrait matting dataset, containing 34,427 images and corresponding matting results.
> The dataset was annotated to a high standard by Beijing Play Star Convergence Technology Co., Ltd., and a portrait soft-segmentation model trained on it has been commercialized.
> The original images in the dataset come from `Flickr`, `Baidu`, and `Taobao`. After face detection and region cropping, 600\*800 half-length portraits were generated.
> The clip_img directory contains the half-length portrait images in jpg format; the matting directory contains the corresponding matting files (convenient for checking matting quality) in png format. You should first extract the alpha map from the png images before training.
- **Repository:** [aisegmentcn/matting_human_datasets](https://github.com/aisegmentcn/matting_human_datasets)
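Extracting the alpha map that the quoted instructions mention might look like the sketch below (an illustration only, assuming Pillow and NumPy are available; `extract_alpha` is a hypothetical helper, not part of the dataset):

```python
from PIL import Image
import numpy as np

def extract_alpha(matting_png_path):
    """Return the alpha map of a matting PNG as a uint8 array in [0, 255].

    The matting files are RGBA PNGs; the alpha channel is the matte,
    which the dataset authors suggest extracting before training.
    """
    with Image.open(matting_png_path) as im:
        return np.array(im.convert("RGBA"))[:, :, 3]
```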
## Dataset Structure
```text
└── data/
├── clip_img/
│ └── {group-id}/
│ └── clip_{subgroup-id}/
│ └── {group-id}-{img-id}.jpg
└── matting/
└── {group-id}/
└── matting_{subgroup-id}/
└── {group-id}-{img-id}.png
```
The input `data/clip_img/1803151818/clip_00000000/1803151818-00000003.jpg` matches the label `data/matting/1803151818/matting_00000000/1803151818-00000003.png`.
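This pairing convention can be sketched as a small path-mapping function (a hypothetical helper based on the layout above, not an official utility):

```python
from pathlib import Path

def matting_path_for(clip_path):
    """Map a clip_img jpg path to its corresponding matting png path,
    following the {group-id}/clip_{subgroup-id} -> {group-id}/matting_{subgroup-id}
    directory layout described above."""
    p = Path(clip_path)
    subgroup = p.parent.name.replace("clip_", "matting_", 1)
    return str(Path("data/matting") / p.parent.parent.name / subgroup / (p.stem + ".png"))
```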
### Licensing Information
See the authors' [GitHub repository](https://github.com/aisegmentcn/matting_human_datasets).
|
Ayush2609/AJ_sentence | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 249843.62830074583
num_examples: 4464
- name: validation
num_bytes: 27816.37169925418
num_examples: 497
download_size: 179173
dataset_size: 277660.0
---
# Dataset Card for "AJ_sentence"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/nepenee_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of nepenee/ネフェニー (Fire Emblem)
This is the dataset of nepenee/ネフェニー (Fire Emblem), containing 189 images and their tags.
The core tags of this character are `green_hair, long_hair, green_eyes, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g., Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 189 | 199.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nepenee_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 189 | 132.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nepenee_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 385 | 240.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nepenee_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 189 | 185.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nepenee_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 385 | 312.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nepenee_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/nepenee_fireemblem',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, blue_armor, breastplate, helmet, solo, spear, thighhighs, blue_eyes, boots, skirt, full_body, holding_weapon, shield, simple_background, belt, white_background, detached_sleeves, looking_at_viewer |
| 1 | 5 |  |  |  |  |  | 1girl, blue_armor, helmet, solo, breastplate, spear, shield, thighhighs, belt |
| 2 | 7 |  |  |  |  |  | 1girl, hetero, solo_focus, vaginal, blush, nipples, rape, armor, helmet, multiple_penises, cum_in_pussy, large_breasts, mosaic_censoring, spread_legs, tears, thighhighs, torn_clothes, 3boys, gangbang, medium_breasts, mmf_threesome, straddling |
| 3 | 7 |  |  |  |  |  | 1boy, 1girl, helmet, hetero, day, large_breasts, nipples, open_mouth, blush, cum_in_pussy, penis, solo_focus, vaginal, bar_censor, blue_armor, breasts_out, clothed_sex, overflow, very_long_hair, anus, blue_sky, outdoors |
| 4 | 6 |  |  |  |  |  | 1girl, day, looking_at_viewer, outdoors, solo, cloud, large_breasts, navel, black_bikini, blue_sky, blush, cleavage, helmet, ocean |
| 5 | 5 |  |  |  |  |  | 1girl, blue_dress, collarbone, helmet, long_sleeves, medium_breasts, solo, veil, wide_sleeves, aqua_eyes, bangs, blue_footwear, full_body, gradient_hair, puffy_sleeves, simple_background, frilled_sleeves, looking_at_viewer, shoes, smile, white_background, arrow_(projectile), bare_shoulders, blue_armor, closed_mouth, detached_sleeves, holding_bow_(weapon), looking_away, standing |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blue_armor | breastplate | helmet | solo | spear | thighhighs | blue_eyes | boots | skirt | full_body | holding_weapon | shield | simple_background | belt | white_background | detached_sleeves | looking_at_viewer | hetero | solo_focus | vaginal | blush | nipples | rape | armor | multiple_penises | cum_in_pussy | large_breasts | mosaic_censoring | spread_legs | tears | torn_clothes | 3boys | gangbang | medium_breasts | mmf_threesome | straddling | 1boy | day | open_mouth | penis | bar_censor | breasts_out | clothed_sex | overflow | very_long_hair | anus | blue_sky | outdoors | cloud | navel | black_bikini | cleavage | ocean | blue_dress | collarbone | long_sleeves | veil | wide_sleeves | aqua_eyes | bangs | blue_footwear | gradient_hair | puffy_sleeves | frilled_sleeves | shoes | smile | arrow_(projectile) | bare_shoulders | closed_mouth | holding_bow_(weapon) | looking_away | standing |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:--------------|:---------|:-------|:--------|:-------------|:------------|:--------|:--------|:------------|:-----------------|:---------|:--------------------|:-------|:-------------------|:-------------------|:--------------------|:---------|:-------------|:----------|:--------|:----------|:-------|:--------|:-------------------|:---------------|:----------------|:-------------------|:--------------|:--------|:---------------|:--------|:-----------|:-----------------|:----------------|:-------------|:-------|:------|:-------------|:--------|:-------------|:--------------|:--------------|:-----------|:-----------------|:-------|:-----------|:-----------|:--------|:--------|:---------------|:-----------|:--------|:-------------|:-------------|:---------------|:-------|:---------------|:------------|:--------|:----------------|:----------------|:----------------|:------------------|:--------|:--------|:---------------------|:-----------------|:---------------|:-----------------------|:---------------|:-----------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | | | X | | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | X | | X | | | | | | | | | | | | | | | X | X | X | X | X | | | | X | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | | | X | X | | | | | | | | | | | | | X | | | | X | | | | | | X | | | | | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | X | | X | X | | | | | | X | | | X | | X | X | X | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
WESLEY453/mcpozedataset2 | ---
license: openrail
---
|
Navvye/TrialTSV | ---
license: mit
---
|
mtkinit/short_slovak_sentiment | ---
pretty_name: short-slovak-sentiment
---
# short-slovak-sentiment
Created from AIOD platform |
davanstrien/autotrain-data-newspaper-type-clean | Invalid username or password. |
jeapaul/english_europarl_bilingual_processed | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 281100121
num_examples: 1892723
download_size: 155904108
dataset_size: 281100121
---
# Dataset Card for "english_europarl_bilingual_processed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/nitocris_alter_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of nitocris_alter/ニトクリス〔オルタ〕/尼托克丽丝〔Alter〕 (Fate/Grand Order)
This is the dataset of nitocris_alter/ニトクリス〔オルタ〕/尼托克丽丝〔Alter〕 (Fate/Grand Order), containing 144 images and their tags.
The core tags of this character are `animal_ears, dark-skinned_female, dark_skin, facial_mark, jackal_ears, white_hair, yellow_eyes, multicolored_hair, hairband, sidelocks, breasts, streaked_hair, antenna_hair, colored_inner_hair, hoop_earrings, earrings, black_hair, medium_breasts, short_hair, long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g., Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 144 | 287.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nitocris_alter_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 144 | 242.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nitocris_alter_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 348 | 469.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nitocris_alter_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/nitocris_alter_fgo',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 31 |  |  |  |  |  | 1girl, facepaint, jewelry, solo, usekh_collar, looking_at_viewer, ankh, bandages, very_long_hair |
| 1 | 35 |  |  |  |  |  | 1girl, ankh, facepaint, solo, usekh_collar, armlet, looking_at_viewer, red_cape, bandages, gold_trim, bracelet, bracer, belly_chain, pelvic_curtain |
| 2 | 5 |  |  |  |  |  | 1girl, ankh, facepaint, fire, looking_at_viewer, solo, usekh_collar, armlet, belly_chain, necklace, underboob, thighs |
| 3 | 10 |  |  |  |  |  | 1girl, facepaint, looking_at_viewer, solo, thighs, collarbone, navel, nude, convenient_censoring, fire, large_breasts, jewelry |
| 4 | 5 |  |  |  |  |  | 1girl, armlet, bare_shoulders, bracelet, facepaint, navel, usekh_collar, belly_chain, blush, large_breasts, looking_at_viewer, smile, thighs, center_opening, solo, white_dress, 1boy, ankh |
| 5 | 5 |  |  |  |  |  | 1boy, 1girl, blush, facepaint, hetero, looking_at_viewer, open_mouth, penis, cum_in_pussy, mosaic_censoring, navel, nipples, sex, thighs, vaginal, bandages, collarbone, cowgirl_position, girl_on_top, jewelry, large_breasts, solo_focus, spread_legs, sweat, completely_nude, pov, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | facepaint | jewelry | solo | usekh_collar | looking_at_viewer | ankh | bandages | very_long_hair | armlet | red_cape | gold_trim | bracelet | bracer | belly_chain | pelvic_curtain | fire | necklace | underboob | thighs | collarbone | navel | nude | convenient_censoring | large_breasts | bare_shoulders | blush | smile | center_opening | white_dress | 1boy | hetero | open_mouth | penis | cum_in_pussy | mosaic_censoring | nipples | sex | vaginal | cowgirl_position | girl_on_top | solo_focus | spread_legs | sweat | completely_nude | pov |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------------|:----------|:-------|:---------------|:--------------------|:-------|:-----------|:-----------------|:---------|:-----------|:------------|:-----------|:---------|:--------------|:-----------------|:-------|:-----------|:------------|:---------|:-------------|:--------|:-------|:-----------------------|:----------------|:-----------------|:--------|:--------|:-----------------|:--------------|:-------|:---------|:-------------|:--------|:---------------|:-------------------|:----------|:------|:----------|:-------------------|:--------------|:-------------|:--------------|:--------|:------------------|:------|
| 0 | 31 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 35 |  |  |  |  |  | X | X | | X | X | X | X | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | | X | X | X | X | | | X | | | | | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 10 |  |  |  |  |  | X | X | X | X | | X | | | | | | | | | | | X | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | | X | X | X | X | | | X | | | X | | X | | | | | X | | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | X | X | | | X | | X | | | | | | | | | | | | X | X | X | | | X | | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
zpn/bace_regression | ---
annotations_creators:
- machine-generated
language_creators:
- machine-generated
license:
- mit
multilinguality:
- monolingual
pretty_name: bace_regression
size_categories:
- 1K<n<10K
source_datasets: []
tags:
- bio
- bio-chem
- molnet
- molecule-net
- biophysics
task_categories:
- other
task_ids: []
---
# Dataset Card for bace_regression
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://moleculenet.org/
- **Repository:** https://github.com/deepchem/deepchem/tree/master
- **Paper:** https://arxiv.org/abs/1703.00564
### Dataset Summary
`bace_regression` is a dataset included in [MoleculeNet](https://moleculenet.org/). It consists of quantitative (IC50) binding results for a set of inhibitors of human β-secretase 1 (BACE-1).
## Dataset Structure
### Data Fields
Each split contains
* `smiles`: the [SMILES](https://en.wikipedia.org/wiki/Simplified_molecular-input_line-entry_system) representation of a molecule
* `selfies`: the [SELFIES](https://github.com/aspuru-guzik-group/selfies) representation of a molecule
* `target`: the `IC50` binding results
### Data Splits
The dataset is split 80/10/10 into train/valid/test sets using a scaffold split.
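A scaffold split assigns whole scaffold groups, rather than individual molecules, to each split, so that structurally similar molecules do not leak between train and test. A minimal sketch of the idea (assuming scaffold identifiers are precomputed, since deriving real Bemis–Murcko scaffolds requires RDKit; this is not DeepChem's exact implementation):

```python
from collections import defaultdict

def scaffold_split(scaffolds, frac_train=0.8, frac_valid=0.1):
    """Assign whole scaffold groups to train/valid/test, largest groups first.

    `scaffolds` is one scaffold identifier per molecule; returns index lists.
    """
    groups = defaultdict(list)
    for idx, scaf in enumerate(scaffolds):
        groups[scaf].append(idx)
    # place the biggest scaffold groups first so train fills up with them
    ordered = sorted(groups.values(), key=len, reverse=True)
    n = len(scaffolds)
    train, valid, test = [], [], []
    for group in ordered:
        if len(train) + len(group) <= frac_train * n:
            train.extend(group)
        elif len(valid) + len(group) <= frac_valid * n:
            valid.extend(group)
        else:
            test.extend(group)
    return train, valid, test
```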
### Source Data
#### Initial Data Collection and Normalization
Data was originally generated by the Pande Group at Stanford.
### Licensing Information
This dataset was originally released under an MIT license.
### Citation Information
```
@misc{https://doi.org/10.48550/arxiv.1703.00564,
doi = {10.48550/ARXIV.1703.00564},
url = {https://arxiv.org/abs/1703.00564},
author = {Wu, Zhenqin and Ramsundar, Bharath and Feinberg, Evan N. and Gomes, Joseph and Geniesse, Caleb and Pappu, Aneesh S. and Leswing, Karl and Pande, Vijay},
keywords = {Machine Learning (cs.LG), Chemical Physics (physics.chem-ph), Machine Learning (stat.ML), FOS: Computer and information sciences, FOS: Computer and information sciences, FOS: Physical sciences, FOS: Physical sciences},
title = {MoleculeNet: A Benchmark for Molecular Machine Learning},
publisher = {arXiv},
year = {2017},
copyright = {arXiv.org perpetual, non-exclusive license}
}
```
### Contributions
Thanks to [@zanussbaum](https://github.com/zanussbaum) for adding this dataset.
|
El-chapoo/simple-dolly | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 6904902
num_examples: 15015
download_size: 4447035
dataset_size: 6904902
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mmhzlrj/Genealogy | ---
license: apache-2.0
language:
- zh
---
The dataset contains the cover and 164 pages of a genealogy book, typeset vertically in a mix of simplified and traditional Chinese characters. |
cc92yy3344/vegetable | ---
annotations_creators:
- crowdsourced
language:
- zh
language_creators:
- found
license:
- apache-2.0
multilinguality:
- monolingual
pretty_name: "15\u79CD\u852C\u83DC\u6570\u636E\u96C6"
size_categories:
- 10K<n<100K
source_datasets:
- original
tags:
- "\u852C\u83DC"
- "\u56FE\u50CF\u5206\u7C7B"
task_categories:
- image-classification
task_ids:
- multi-class-image-classification
---
## Vegetable Image Dataset
### Background
The original experiments were conducted with 15 common vegetables found around the world: bean, bitter gourd, bottle gourd, brinjal (eggplant), broccoli, cabbage, capsicum, carrot, cauliflower, cucumber, papaya, potato, pumpkin, radish, and tomato. A total of 21,000 images across the 15 classes were used, with each class containing 1,400 images of size 224×224 in *.jpg format. 70% of the dataset is used for training, 15% for validation, and 15% for testing.
### Directory
This dataset contains three folders:
- train (15,000 images)
- test (3,000 images)
- validation (3,000 images)
### Data Collection
The images in this dataset were collected by us from vegetable farms and markets for a project.
### Generating the Metadata Files
Running the `python` code below generates three metadata files in CSV format on the desktop, plus one class-name file (which should be placed into the data files):
```python
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
1. Download the data file Vegetable Images.zip and extract it to the desktop.
2. Then run `python generate.py` to generate the three metadata files and the class-name file.
"""
import os
from pathlib import Path

category_dict = {
    'Bean': '豆类',
    'Bitter_Gourd': '苦瓜',
    'Bottle_Gourd': '葫芦',
    'Brinjal': '茄子',
    'Broccoli': '西兰花',
    'Cabbage': '卷心菜',
    'Capsicum': '辣椒',
    'Carrot': '胡萝卜',
    'Cauliflower': '花椰菜',
    'Cucumber': '黄瓜',
    'Papaya': '木瓜',
    'Potato': '土豆',
    'Pumpkin': '南瓜',
    'Radish': '萝卜',
    'Tomato': '番茄',
}

base_path = Path.home().joinpath('desktop')
# note: relies on dicts preserving insertion order (Python 3.6+)
data = '\n'.join(item for item in category_dict.values())
base_path.joinpath('classname.txt').write_text(data, encoding='utf-8')


def create(filename):
    csv_path = base_path.joinpath(f'{filename}.csv')
    with csv_path.open('wt', encoding='utf-8', newline='') as csv:
        csv.writelines([f'image,category{os.linesep}'])
        data_path = base_path.joinpath('Vegetable Images', filename)
        batch = 0
        datas = []
        keys = list(category_dict.keys())
        for image_path in data_path.rglob('*.jpg'):
            batch += 1
            part1 = str(image_path).removeprefix(str(base_path)).replace('\\', '/')[1:]
            part2 = keys.index(image_path.parents[0].name)
            datas.append(f'{part1},{part2}{os.linesep}')
            if batch > 100:
                csv.writelines(datas)
                datas.clear()
                batch = 0  # reset the counter so rows keep being flushed in batches
        if datas:
            csv.writelines(datas)
    return csv_path.stat().st_size


if __name__ == '__main__':
    print(create('train'))
    print(create('test'))
    print(create('validation'))
```
### Acknowledgements
Many thanks to the original dataset provider, [Vegetable Image Dataset](https://www.kaggle.com/datasets/misrakahmed/vegetable-image-dataset).
### Cloning the Data
```bash
git clone https://huggingface.co/datasets/cc92yy3344/vegetable.git
``` |
autoevaluate/autoeval-eval-futin__feed-sen_en_-7dbe88-2245971656 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- futin/feed
eval_info:
task: text_zero_shot_classification
model: facebook/opt-6.7b
metrics: []
dataset_name: futin/feed
dataset_config: sen_en_
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: facebook/opt-6.7b
* Dataset: futin/feed
* Config: sen_en_
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@futin](https://huggingface.co/futin) for evaluating this model. |
TExtPhish/TExtPhish | ---
license: cc-by-nc-nd-4.0
task_categories:
- text-classification
- sentence-similarity
language:
- en
tags:
- security
- ML
- NLP
- sentiment
pretty_name: TExtPhish
extra_gated_heading: "You need to agree to share your contact information to access TExtPhish"
extra_gated_prompt: "The emails in the **TExtPhish** Email Collection Corpus are under the license of ***cc-by-nc-nd-4.0***, and their use is governed by the following agreements: \n - You agree to not distribute or reproduce any derivatives, in whole or in part, any document from the Collection. \n- You agree to not attempt to identify, or speculate on the identity of, any individual in **TExtPhish** Collection, even if that information is available from public sources.\n - Re-use of this data is also subject to Reddit API terms which includes: \n * not encouraging or promoting illegal activity. \n * not using this dataset with the intent of introducing any viruses, worms, defects, Trojan horses, malware, or any other items of a destructive nature. \n * no selling, leasing, or sublicensing this data whether for direct commercial or monetary gain. \n\nIn the event that End User violates the terms of this agreement, then upon notice from the dataset maintainers, end users shall cease use of the collection and destroy all copies of the collection and other documents that contain excerpts from the Collection.\n\nWe would like to keep track of this dataset users for statistics purposes (how many users and affiliations) and agreement only."
extra_gated_fields:
  I agree to use TExtPhish dataset for non-commercial intended use ONLY: checkbox
extra_gated_button_content: "Acknowledge License"
viewer: false
---
# Dataset Card for TExtPhish
## Dataset Description
### Dataset Summary
This dataset card aims to describe the **TExtPhish** collection and its intended use.
### Languages
The current version only includes data samples in English, partly drawn from Reddit users on the [r/Scams](https://www.reddit.com/r/Scams/comments/n00kg3/the_blackmail_email_scam_part_7/###) blackmail subreddit.
In the future, we would like to explore more languages. Collaborators are encouraged to contact the authors to extend the current version with more diverse extortion emails in different languages.
## Dataset Structure
### Initial Data Collection and Sanitization
First, we select benign samples from publicly available datasets such as Enron and SpamAssassin.
We extract each email from its thread and tokenize personally sensitive information using named entity recognition, regular expressions, and synthetic replacement.
Second, we collect extortion attacks from [r/Scams](https://www.reddit.com/r/Scams/comments/n00kg3/the_blackmail_email_scam_part_7/###) Reddit posts and botnet ransomware emails from the [Malware Traffic Analysis repository](https://www.malware-traffic-analysis.net).
We remove unnecessary comments from the Reddit threads and keep only the extortion emails.
To make the dataset challenging, we keep only the benign emails most semantically similar to the extortion attacks.
For semantic textual similarity, we first apply Sentence Transformers (SBERT) to obtain contextual sentence embeddings of the benign and extortion samples.
Then, we apply Facebook AI Similarity Search (FAISS) to retrieve the benign instances most similar to the extortion attacks.
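The core of that retrieval step is a nearest-neighbor search over embeddings. A minimal sketch of the idea using plain NumPy cosine similarity (illustrative only; the authors use a FAISS index, and the function name and toy data here are assumptions):

```python
import numpy as np

def most_similar(benign_emb, extortion_emb, k=5):
    """For each extortion embedding, return the indices of the k most
    similar benign embeddings by cosine similarity (what a FAISS
    inner-product index computes on L2-normalized vectors)."""
    b = benign_emb / np.linalg.norm(benign_emb, axis=1, keepdims=True)
    e = extortion_emb / np.linalg.norm(extortion_emb, axis=1, keepdims=True)
    sims = e @ b.T                       # (n_extortion, n_benign) similarities
    return np.argsort(-sims, axis=1)[:, :k]
```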
### Data Instances
|Extortion Class| Examples from Sentence-level subset|
|---|---|
|Blackmail| - I will delete the corresponding recording and I will not blackmail you ever again.|
|Ransomware| - Tap to Download Attachment Xinalink_servicescom (10.3 KB).|
|Sextortion| - In case you ignore me, within 96 h, ur sex tape will be posted on the net.|
### Data Sources
The following tables describe the data sources used to generate this dataset.
* **Extortion Data**
|Source|Total number of Emails| Total number of Sentences|
|---|---|---|
|[r/Scams](https://www.reddit.com/r/Scams/comments/n00kg3/the_blackmail_email_scam_part_7/###) Extortion Emails | 1,113 | 17,393 |
|Botnet Ransomware Emails | 150 | 1,510 |
* **Benign Data**
|Source|Total number of Emails| Total number of Sentences|
|---|---|---|
|[Enron](https://www.cs.cmu.edu/~enron/)| 1,360 | 26,835 |
|[SpamAssasin](https://spamassassin.apache.org/old/publiccorpus/)| 1,010 | 12,348 |
### Data Fields
The dataset is structured as follows:
```text
list[{
    "src": str,      # data source (e.g., SpamAssassin, Enron, Reddit)
    "content": str,  # content (sentence-level or email-level)
    "label": str,    # extortion label (blackmail, ransomware, sextortion) or benign
}]
```
### Loading TExtPhish Dataset
To load the email-level subset:
```python
from datasets import load_dataset

email_subset = load_dataset("TExtPhish/TExtPhish", data_dir="email-level", split="train", sep=";")
```
To load the sentence-level subset:
```python
sentence_subset = load_dataset("TExtPhish/TExtPhish", data_dir="sentence-level", split="train", sep=";")
```
To load the homograph-perturbed subset of sentences:
```python
homograph_subset = load_dataset("TExtPhish/TExtPhish", data_dir="homograph-perturbed-sentences", split="train", sep=";")
```
### Splitting TExtPhish Dataset
If you would like to load the dataset in a cross-validation setting, you can load the train and test portions divided into k folds (example below with k=10):
```python
from datasets import load_dataset

test_folds = load_dataset('TExtPhish/TExtPhish', split=[f"train[{k}%:{k+10}%]" for k in range(0, 100, 10)], data_dir="sentence-level", sep=';')
train_folds = load_dataset('TExtPhish/TExtPhish', split=[f"train[:{k}%]+train[{k+10}%:]" for k in range(0, 100, 10)], data_dir="sentence-level", sep=';')
```
These ready-to-use folds divide TExtPhish into k=10 parts: each test set is one 10% chunk, and the corresponding training set is the complementary 90%. The procedure is repeated k=10 times, each time reserving a different tenth for testing.
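The percentage-slice pattern above can be sketched locally as a plain index partition (an illustration of the fold layout only, not the `datasets` library's exact percent rounding):

```python
def ten_folds(n):
    """Yield (train_idx, test_idx) pairs for 10 contiguous 10% folds,
    mirroring the train[k%:(k+10)%] slicing used above."""
    for k in range(0, 100, 10):
        lo, hi = n * k // 100, n * (k + 10) // 100
        test = list(range(lo, hi))
        train = list(range(0, lo)) + list(range(hi, n))
        yield train, test
```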
### Binarize Labels
```python
from sklearn.preprocessing import LabelEncoder

# Transform text labels into encoded integer labels using LabelEncoder
multibin = LabelEncoder()
Y_train = multibin.fit_transform(Y_train)
Y_test = multibin.transform(Y_test)  # reuse the mapping fitted on the training labels
```
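For intuition, `LabelEncoder` simply maps each distinct label string to an integer id; a stdlib-only sketch of that behavior (illustrative, not the scikit-learn implementation):

```python
# Stdlib-only illustration of what LabelEncoder does: map each distinct
# label to a stable integer id, fit on the training labels and reused on test.
class SimpleLabelEncoder:
    def fit_transform(self, labels):
        self.classes_ = sorted(set(labels))                 # e.g. ['benign', 'blackmail', ...]
        self._index = {c: i for i, c in enumerate(self.classes_)}
        return [self._index[c] for c in labels]

    def transform(self, labels):
        # Unlike calling fit_transform again, this keeps the train-time
        # mapping, so train and test ids stay consistent.
        return [self._index[c] for c in labels]

enc = SimpleLabelEncoder()
y_train = enc.fit_transform(["benign", "sextortion", "blackmail", "benign"])  # [0, 2, 1, 0]
y_test = enc.transform(["blackmail", "benign"])                               # [1, 0]
```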
### Personal and Sensitive Information
We ensured that any personal and sensitive information was removed before uploading our dataset.
The emails provided in this corpus are stripped of sensitive information, which is replaced with tokens (e.g., url_token), synthetically substituted, or obfuscated in the original (***) in order to anonymize the data.
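The token substitution described above can be approximated with a simple regex pass; the patterns below are illustrative assumptions, since the actual preprocessing pipeline is not published:

```python
import re

# Illustrative scrubbing pass: replace URLs and email addresses with
# placeholder tokens, in the spirit of the url_token substitution above.
# The real TExtPhish pipeline is not published; these patterns are examples.
URL_RE = re.compile(r"https?://\S+")
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def scrub(text):
    text = URL_RE.sub("url_token", text)
    text = EMAIL_RE.sub("email_token", text)
    return text

sample = "Pay at http://evil.example/pay or reply to victim@mail.com"
# scrub(sample) -> "Pay at url_token or reply to email_token"
```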
## Considerations for Using the Data
### Intended Uses
Our collection may only be used for non-profit linguistic research, including but not limited to Information Retrieval, Text Classification, Natural Language Processing, Machine Learning, Phishing Detection, Data Privacy and Security, and related fields.
### Social Impact of Dataset
Users are solely responsible for any misuse of the dataset that goes against its original intended use.
The extortion dataset should not be used for any harmful purpose, such as instituting or propagating attacks.
*Positive Social Impact*
* Researchers can use **TExtPhish** to study the tactics and techniques used by attackers, identify vulnerabilities, and develop effective countermeasures against extortion.
* Educators can use **TExtPhish** to teach students about online safety, how to recognize phishing extortion attempts, and best practices for protecting personal information and preventing financial loss.
* Cybersecurity professionals can use **TExtPhish** to train machine learning models to detect and block phishing emails with money extortion attempts, improve incident response strategies, and minimize financial loss exposure.
*Negative Social Impact*
* Attackers might use **TExtPhish** to create automatic botnets that generate better extortion attacks.
* Attackers might use **TExtPhish** to propagate deception and propaganda online.
* Attackers might attempt to use **TExtPhish** as an initial phase to deploy malware or ransomware, or to embed trojans within a targeted system to gain remote access.
## Additional Information
### Licensing Information
As the maintainers of this dataset, we release it under the Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0) license to ensure that the dataset remains non-commercial and that no document from the Collection is distributed or reproduced, in whole or in part.
A portion of our dataset was downloaded using Reddit's API wrapper through the PRAW package for the Python programming language. Re-use of this data is subject to Reddit API terms, which include:
* Users shall not encourage or promote illegal activity throughout the use of this dataset.
* Users shall not use this dataset with the intent of introducing any viruses, worms, defects, Trojan horses, malware, or any other items of a destructive nature.
* Users shall not sell, lease, or sublicense this data whether for direct commercial or monetary gain.
### Citation Information
Information about citation will soon be updated.
|
huggingnft/nftrex | ---
tags:
- huggingnft
- nft
- huggan
- gan
- image
- images
task:
- unconditional-image-generation
datasets:
- huggingnft/nftrex
license: mit
---
# Dataset Card
## Disclaimer
All rights belong to their owners.
Models and datasets can be removed from the site at the request of the copyright holder.
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingnft](https://github.com/AlekseyKorshuk/huggingnft)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingnft](https://github.com/AlekseyKorshuk/huggingnft)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Dataset Summary
NFT images dataset for unconditional generation.
NFT collection available [here](https://opensea.io/collection/nftrex).
Model is available [here](https://huggingface.co/huggingnft/nftrex).
Check Space: [link](https://huggingface.co/spaces/AlekseyKorshuk/huggingnft).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingnft/nftrex")
```
## Dataset Structure
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Data Fields
The data fields are the same among all splits.
- `image`: an `image` feature.
- `id`: an `int` feature.
- `token_metadata`: a `str` feature.
- `image_original_url`: a `str` feature.
### Data Splits
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingnft,
    author = {Aleksey Korshuk},
    year = {2022}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingnft)
|
freshpearYoon/vr_train_free_34 | ---
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: filename
dtype: string
- name: NumOfUtterance
dtype: int64
- name: text
dtype: string
- name: samplingrate
dtype: int64
- name: begin_time
dtype: float64
- name: end_time
dtype: float64
- name: speaker_id
dtype: string
- name: directory
dtype: string
splits:
- name: train
num_bytes: 6616696583
num_examples: 10000
download_size: 1039242942
dataset_size: 6616696583
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
TigerResearch/tigerbot-zhihu-zh-10k | ---
license: apache-2.0
language:
- zh
---
[Tigerbot](https://github.com/TigerResearch/TigerBot) SFT question-answer pairs generated from openly collected Zhihu data
## Usage
```python
import datasets
ds_sft = datasets.load_dataset('TigerResearch/tigerbot-zhihu-zh-10k')
``` |
ds4sd/DocLayNet | ---
annotations_creators:
- crowdsourced
license: other
pretty_name: DocLayNet
size_categories:
- 10K<n<100K
tags:
- layout-segmentation
- COCO
- document-understanding
- PDF
task_categories:
- object-detection
- image-segmentation
task_ids:
- instance-segmentation
---
# Dataset Card for DocLayNet
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Annotations](#annotations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://developer.ibm.com/exchanges/data/all/doclaynet/
- **Repository:** https://github.com/DS4SD/DocLayNet
- **Paper:** https://doi.org/10.1145/3534678.3539043
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
DocLayNet provides page-by-page layout segmentation ground-truth using bounding-boxes for 11 distinct class labels on 80863 unique pages from 6 document categories. It provides several unique features compared to related work such as PubLayNet or DocBank:
1. *Human Annotation*: DocLayNet is hand-annotated by well-trained experts, providing a gold-standard in layout segmentation through human recognition and interpretation of each page layout
2. *Large layout variability*: DocLayNet includes diverse and complex layouts from a large variety of public sources in Finance, Science, Patents, Tenders, Law texts and Manuals
3. *Detailed label set*: DocLayNet defines 11 class labels to distinguish layout features in high detail.
4. *Redundant annotations*: A fraction of the pages in DocLayNet are double- or triple-annotated, allowing estimation of annotation uncertainty and an upper bound on achievable prediction accuracy with ML models
5. *Pre-defined train, test, and validation sets*: DocLayNet provides fixed sets for each to ensure proportional representation of the class labels and to avoid leakage of unique layout styles across the sets.
### Supported Tasks and Leaderboards
We are hosting a competition in ICDAR 2023 based on the DocLayNet dataset. For more information see https://ds4sd.github.io/icdar23-doclaynet/.
## Dataset Structure
### Data Fields
DocLayNet provides four types of data assets:
1. PNG images of all pages, resized to square `1025 x 1025px`
2. Bounding-box annotations in COCO format for each PNG image
3. Extra: Single-page PDF files matching each PNG image
4. Extra: JSON file matching each PDF page, which provides the digital text cells with coordinates and content
Each COCO image record is defined as in this example:
```js
...
{
"id": 1,
"width": 1025,
"height": 1025,
"file_name": "132a855ee8b23533d8ae69af0049c038171a06ddfcac892c3c6d7e6b4091c642.png",
// Custom fields:
"doc_category": "financial_reports" // high-level document category
"collection": "ann_reports_00_04_fancy", // sub-collection name
"doc_name": "NASDAQ_FFIN_2002.pdf", // original document filename
"page_no": 9, // page number in original document
"precedence": 0, // Annotation order, non-zero in case of redundant double- or triple-annotation
},
...
```
The `doc_category` field uses one of the following constants:
```
financial_reports,
scientific_articles,
laws_and_regulations,
government_tenders,
manuals,
patents
```
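Since the annotations are standard COCO JSON, the custom fields can be read with stdlib tooling alone; a minimal sketch (the inline record is a toy example shaped like the one above, not actual DocLayNet data):

```python
from collections import Counter

# Sketch: count pages per document category from parsed COCO annotations.
# `coco` is the dict produced by json.load() on a COCO annotation file.
def pages_per_category(coco):
    return Counter(img["doc_category"] for img in coco["images"])

coco = {
    "images": [
        {"id": 1, "doc_category": "financial_reports", "page_no": 9},
        {"id": 2, "doc_category": "patents", "page_no": 1},
        {"id": 3, "doc_category": "financial_reports", "page_no": 2},
    ]
}
# pages_per_category(coco) -> Counter({'financial_reports': 2, 'patents': 1})
```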
### Data Splits
The dataset provides three splits:
- `train`
- `val`
- `test`
## Dataset Creation
### Annotations
#### Annotation process
The labeling guideline used for training of the annotation experts are available at [DocLayNet_Labeling_Guide_Public.pdf](https://raw.githubusercontent.com/DS4SD/DocLayNet/main/assets/DocLayNet_Labeling_Guide_Public.pdf).
#### Who are the annotators?
Annotations are crowdsourced.
## Additional Information
### Dataset Curators
The dataset is curated by the [Deep Search team](https://ds4sd.github.io/) at IBM Research.
You can contact us at [deepsearch-core@zurich.ibm.com](mailto:deepsearch-core@zurich.ibm.com).
Curators:
- Christoph Auer, [@cau-git](https://github.com/cau-git)
- Michele Dolfi, [@dolfim-ibm](https://github.com/dolfim-ibm)
- Ahmed Nassar, [@nassarofficial](https://github.com/nassarofficial)
- Peter Staar, [@PeterStaar-IBM](https://github.com/PeterStaar-IBM)
### Licensing Information
License: [CDLA-Permissive-1.0](https://cdla.io/permissive-1-0/)
### Citation Information
```bib
@article{doclaynet2022,
title = {DocLayNet: A Large Human-Annotated Dataset for Document-Layout Segmentation},
doi = {10.1145/3534678.3539043},
url = {https://doi.org/10.1145/3534678.3539043},
author = {Pfitzmann, Birgit and Auer, Christoph and Dolfi, Michele and Nassar, Ahmed S and Staar, Peter W J},
year = {2022},
isbn = {9781450393850},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
booktitle = {Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining},
pages = {3743–3751},
numpages = {9},
location = {Washington DC, USA},
series = {KDD '22}
}
```
### Contributions
Thanks to [@dolfim-ibm](https://github.com/dolfim-ibm), [@cau-git](https://github.com/cau-git) for adding this dataset.
|
open-llm-leaderboard/details_openlm-research__open_llama_3b_v2 | ---
pretty_name: Evaluation run of openlm-research/open_llama_3b_v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [openlm-research/open_llama_3b_v2](https://huggingface.co/openlm-research/open_llama_3b_v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openlm-research__open_llama_3b_v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-15T11:22:56.677003](https://huggingface.co/datasets/open-llm-leaderboard/details_openlm-research__open_llama_3b_v2/blob/main/results_2023-10-15T11-22-56.677003.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001153523489932886,\n\
\ \"em_stderr\": 0.0003476179896857095,\n \"f1\": 0.05134962248322172,\n\
\ \"f1_stderr\": 0.0012730168443049574,\n \"acc\": 0.3395923103113801,\n\
\ \"acc_stderr\": 0.007914879526646601\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.001153523489932886,\n \"em_stderr\": 0.0003476179896857095,\n\
\ \"f1\": 0.05134962248322172,\n \"f1_stderr\": 0.0012730168443049574\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.009097801364670205,\n \
\ \"acc_stderr\": 0.002615326510775673\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.67008681925809,\n \"acc_stderr\": 0.013214432542517527\n\
\ }\n}\n```"
repo_url: https://huggingface.co/openlm-research/open_llama_3b_v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|arc:challenge|25_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_15T11_22_56.677003
path:
- '**/details_harness|drop|3_2023-10-15T11-22-56.677003.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-15T11-22-56.677003.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_15T11_22_56.677003
path:
- '**/details_harness|gsm8k|5_2023-10-15T11-22-56.677003.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-15T11-22-56.677003.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hellaswag|10_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T10:28:09.665576.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T10:28:09.665576.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T10:28:09.665576.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_15T11_22_56.677003
path:
- '**/details_harness|winogrande|5_2023-10-15T11-22-56.677003.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-15T11-22-56.677003.parquet'
- config_name: results
data_files:
- split: 2023_07_24T10_28_09.665576
path:
- results_2023-07-24T10:28:09.665576.parquet
- split: 2023_10_15T11_22_56.677003
path:
- results_2023-10-15T11-22-56.677003.parquet
- split: latest
path:
- results_2023-10-15T11-22-56.677003.parquet
---
# Dataset Card for Evaluation run of openlm-research/open_llama_3b_v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/openlm-research/open_llama_3b_v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [openlm-research/open_llama_3b_v2](https://huggingface.co/openlm-research/open_llama_3b_v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openlm-research__open_llama_3b_v2",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-15T11:22:56.677003](https://huggingface.co/datasets/open-llm-leaderboard/details_openlm-research__open_llama_3b_v2/blob/main/results_2023-10-15T11-22-56.677003.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001153523489932886,
"em_stderr": 0.0003476179896857095,
"f1": 0.05134962248322172,
"f1_stderr": 0.0012730168443049574,
"acc": 0.3395923103113801,
"acc_stderr": 0.007914879526646601
},
"harness|drop|3": {
"em": 0.001153523489932886,
"em_stderr": 0.0003476179896857095,
"f1": 0.05134962248322172,
"f1_stderr": 0.0012730168443049574
},
"harness|gsm8k|5": {
"acc": 0.009097801364670205,
"acc_stderr": 0.002615326510775673
},
"harness|winogrande|5": {
"acc": 0.67008681925809,
"acc_stderr": 0.013214432542517527
}
}
```
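As a sanity check on these numbers, the aggregated `acc` in the `"all"` block appears to be the plain mean of the per-task accuracies (a minimal sketch; the task names and values below are copied from the JSON above, and the averaging rule is an observation, not documented behavior):

```python
# Per-task accuracies copied from the results JSON above
# (drop reports em/f1 rather than acc, so it does not contribute here).
results = {
    "harness|gsm8k|5": {"acc": 0.009097801364670205},
    "harness|winogrande|5": {"acc": 0.67008681925809},
}

# Unweighted mean over the tasks that report acc.
accs = [task["acc"] for task in results.values()]
mean_acc = sum(accs) / len(accs)
print(mean_acc)  # matches the "acc" value in the "all" block
```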
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
TheVarunKaushik/Valorant_Advice | ---
license: openrail
---
|
CyberHarem/tachibana_nina_citrus | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Tachibana Nina
This is the dataset of Tachibana Nina, containing 44 images and their tags.
Images were crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 44 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 102 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 133 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 44 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 44 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 44 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 102 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 102 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 83 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 133 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 133 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
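The download links in the table above are relative to this repository. A minimal sketch for turning them into direct URLs, assuming the standard Hugging Face `resolve/main` file layout (the same pattern used by the full URLs in other CyberHarem cards):

```python
# Repository id and an example file name taken from the table above.
repo_id = "CyberHarem/tachibana_nina_citrus"
filename = "dataset-raw.zip"

# Standard Hugging Face dataset file URL layout (assumption).
url = f"https://huggingface.co/datasets/{repo_id}/resolve/main/{filename}"
print(url)
```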
|
celinelee/parens_count | ---
dataset_info:
features:
- name: code
dtype: string
- name: open_paren_count
dtype: int64
splits:
- name: train
num_bytes: 108462969
num_examples: 100443
download_size: 8385368
dataset_size: 108462969
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
aisikoduro/fashion_image_caption-100-v2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 22820471.0
num_examples: 100
download_size: 22820373
dataset_size: 22820471.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "fashion_image_caption-100-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/harutsuki_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of harutsuki/春月/春月 (Azur Lane)
This is the dataset of harutsuki/春月/春月 (Azur Lane), containing 47 images and their tags.
The core tags of this character are `long_hair, very_long_hair, bow, hair_bow, animal_ears, bangs, yellow_eyes, black_hair, breasts, brown_eyes, brown_hair, hair_ornament, small_breasts`, which are pruned in this dataset.
Images were crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 47 | 85.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/harutsuki_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 47 | 47.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/harutsuki_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 115 | 92.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/harutsuki_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 47 | 75.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/harutsuki_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 115 | 131.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/harutsuki_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/harutsuki_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, solo, looking_at_viewer, open_mouth, skirt, white_pantyhose, full_body, :d, black_footwear, frills, obi, wide_sleeves, bare_shoulders, blush, long_sleeves, standing, white_kimono, detached_sleeves, official_alternate_costume, sleeveless_kimono |
| 1 | 6 |  |  |  |  |  | 1girl, detached_sleeves, open_mouth, red_skirt, solo, bare_shoulders, hakama_short_skirt, blush, looking_at_viewer, miko, white_background, wide_sleeves, red_bow, red_hakama, ribbon, simple_background, smile, thighs, white_socks |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | open_mouth | skirt | white_pantyhose | full_body | :d | black_footwear | frills | obi | wide_sleeves | bare_shoulders | blush | long_sleeves | standing | white_kimono | detached_sleeves | official_alternate_costume | sleeveless_kimono | red_skirt | hakama_short_skirt | miko | white_background | red_bow | red_hakama | ribbon | simple_background | smile | thighs | white_socks |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:-------------|:--------|:------------------|:------------|:-----|:-----------------|:---------|:------|:---------------|:-----------------|:--------|:---------------|:-----------|:---------------|:-------------------|:-----------------------------|:--------------------|:------------|:---------------------|:-------|:-------------------|:----------|:-------------|:---------|:--------------------|:--------|:---------|:--------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | | | | | | | | X | X | X | | | | X | | | X | X | X | X | X | X | X | X | X | X | X |
|
open-llm-leaderboard/details_PocketDoc__Dans-AdventurousWinds-7b | ---
pretty_name: Evaluation run of PocketDoc/Dans-AdventurousWinds-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PocketDoc/Dans-AdventurousWinds-7b](https://huggingface.co/PocketDoc/Dans-AdventurousWinds-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PocketDoc__Dans-AdventurousWinds-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T16:13:28.760766](https://huggingface.co/datasets/open-llm-leaderboard/details_PocketDoc__Dans-AdventurousWinds-7b/blob/main/results_2023-10-24T16-13-28.760766.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.32791526845637586,\n\
\ \"em_stderr\": 0.004807646038011011,\n \"f1\": 0.3764691694630872,\n\
\ \"f1_stderr\": 0.004686966609320671,\n \"acc\": 0.46954983116649207,\n\
\ \"acc_stderr\": 0.010810156337777745\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.32791526845637586,\n \"em_stderr\": 0.004807646038011011,\n\
\ \"f1\": 0.3764691694630872,\n \"f1_stderr\": 0.004686966609320671\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.15693707354056102,\n \
\ \"acc_stderr\": 0.010019246595616167\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7821625887924231,\n \"acc_stderr\": 0.011601066079939324\n\
\ }\n}\n```"
repo_url: https://huggingface.co/PocketDoc/Dans-AdventurousWinds-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|arc:challenge|25_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T16_13_28.760766
path:
- '**/details_harness|drop|3_2023-10-24T16-13-28.760766.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T16-13-28.760766.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T16_13_28.760766
path:
- '**/details_harness|gsm8k|5_2023-10-24T16-13-28.760766.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T16-13-28.760766.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hellaswag|10_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T04-04-57.551374.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T04-04-57.551374.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T04-04-57.551374.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T16_13_28.760766
path:
- '**/details_harness|winogrande|5_2023-10-24T16-13-28.760766.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T16-13-28.760766.parquet'
- config_name: results
data_files:
- split: 2023_10_10T04_04_57.551374
path:
- results_2023-10-10T04-04-57.551374.parquet
- split: 2023_10_24T16_13_28.760766
path:
- results_2023-10-24T16-13-28.760766.parquet
- split: latest
path:
- results_2023-10-24T16-13-28.760766.parquet
---
# Dataset Card for Evaluation run of PocketDoc/Dans-AdventurousWinds-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PocketDoc/Dans-AdventurousWinds-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PocketDoc/Dans-AdventurousWinds-7b](https://huggingface.co/PocketDoc/Dans-AdventurousWinds-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PocketDoc__Dans-AdventurousWinds-7b",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-24T16:13:28.760766](https://huggingface.co/datasets/open-llm-leaderboard/details_PocketDoc__Dans-AdventurousWinds-7b/blob/main/results_2023-10-24T16-13-28.760766.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the "results" configuration and in the "latest" split of its own configuration):
```json
{
"all": {
"em": 0.32791526845637586,
"em_stderr": 0.004807646038011011,
"f1": 0.3764691694630872,
"f1_stderr": 0.004686966609320671,
"acc": 0.46954983116649207,
"acc_stderr": 0.010810156337777745
},
"harness|drop|3": {
"em": 0.32791526845637586,
"em_stderr": 0.004807646038011011,
"f1": 0.3764691694630872,
"f1_stderr": 0.004686966609320671
},
"harness|gsm8k|5": {
"acc": 0.15693707354056102,
"acc_stderr": 0.010019246595616167
},
"harness|winogrande|5": {
"acc": 0.7821625887924231,
"acc_stderr": 0.011601066079939324
}
}
```
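For reference, the aggregated "acc" in the "all" section above is the mean of the per-task accuracies. A minimal sketch reproducing it from the per-task values shown in the results (the dictionary below is copied from that JSON, not fetched from the Hub):

```python
# Per-task metrics copied from the latest results JSON above.
latest = {
    "harness|drop|3": {
        "em": 0.32791526845637586,
        "f1": 0.3764691694630872,
    },
    "harness|gsm8k|5": {"acc": 0.15693707354056102},
    "harness|winogrande|5": {"acc": 0.7821625887924231},
}

# Collect the per-task accuracies where "acc" is reported, then average them.
accuracies = {task: m["acc"] for task, m in latest.items() if "acc" in m}
mean_acc = sum(accuracies.values()) / len(accuracies)
print(f"mean acc over {len(accuracies)} tasks: {mean_acc:.17f}")
```

This recovers the `"acc": 0.46954983116649207` value reported in the "all" section.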
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
davo15/test_ragas | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: contexts
sequence: string
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 128529
num_examples: 39
download_size: 74968
dataset_size: 128529
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
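The feature layout above matches the record shape used in RAGAS-style evaluation (question, generated answer, retrieved contexts, reference answer). A hypothetical example row under that schema (the values are invented for illustration):

```python
# One hypothetical row matching the schema above (values are invented):
row = {
    "question": "What is the capital of France?",
    "answer": "Paris is the capital of France.",
    "contexts": [
        "Paris is the capital and largest city of France.",
    ],
    "ground_truth": "Paris",
}

# "contexts" is a sequence of strings; the other fields are plain strings
assert isinstance(row["contexts"], list)
assert all(isinstance(c, str) for c in row["contexts"])
assert all(isinstance(row[k], str) for k in ("question", "answer", "ground_truth"))
```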
|
LuangMV97/Empathetic_counseling_Dataset | ---
dataset_info:
features:
- name: input
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 9143613.730886951
num_examples: 30937
- name: test
num_bytes: 4445059.587284399
num_examples: 7736
download_size: 10363922
dataset_size: 13588673.31817135
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
task_categories:
- text-generation
tags:
- medical
---
# Dataset Card for Dataset Name
Empathetic_counseling is a dataset intended for training conversational language models to generate text in empathetic and mental health counseling dialogues.
## Dataset Details
### Dataset Description
This dataset was created by concatenating examples from the "empathetic_dialogues" dataset with a combination of "Amod/mental_health_counseling_conversations", "EmoCareAI/Psych8k", and "https://github.com/nbertagnolli/counsel-chat.git".
It is composed of "input" and "label" columns: the first is a user utterance and the second is the response the model is expected to predict. The aim was to adapt examples in which the input describes a situation a person is experiencing for a given emotion, and the output is the corresponding empathetic or counseling response.
- **Language(s) (NLP):** English
- **License:** [More Information Needed]
## Uses
Empathetic_counseling is intended for training conversational language models on the text-generation task in empathetic and mental health counseling dialogues.
### Direct Use
Use cases:
- Chatbots
- Virtual assistants
- Emotional counseling conversations
## Dataset Structure
The dataset has 38673 rows, divided into 80% for "train" (30937) and 20% for "test" (7736). The number of examples from each source is as follows:
- empathetic_dialogues: train: 19880, test: 4970.
- Amod/mental_health_counseling_conversations: train: 2805, test: 702.
- EmoCareAI/Psych8k: train: 6549, test: 1638.
- nbertagnolli/counsel-chat (GitHub repository): train: 1703, test: 426.
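The per-source counts above can be cross-checked against the reported split sizes; a small sketch summing them:

```python
# Per-source example counts as listed above: (train, test)
sources = {
    "empathetic_dialogues": (19880, 4970),
    "Amod/mental_health_counseling_conversations": (2805, 702),
    "EmoCareAI/Psych8k": (6549, 1638),
    "nbertagnolli/counsel-chat": (1703, 426),
}

train_total = sum(tr for tr, _ in sources.values())
test_total = sum(te for _, te in sources.values())

# Totals match the dataset_info splits in the card header
assert train_total == 30937
assert test_total == 7736
assert train_total + test_total == 38673
# The split is roughly 80/20
assert round(train_total / (train_total + test_total), 2) == 0.80
```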
## Dataset Creation
### Curation Rationale
The dataset was created to train an encoder-decoder model, with FacebookAI/roberta-base as the encoder and microsoft/DialoGPT-medium as the decoder, serving as the language model for the text-generation task of a master's final project.
#### Data Collection and Processing
Preprocessing consisted of removing unnecessary columns and missing values.
Only part of the EmpatheticDialogues dataset was used, in order to keep its size balanced with the rest of the resulting dataset; the number of examples was taken from its original paper.
## Dataset Card Authors
The dataset author is Luis Angel Motta Valero, a VIU student.
## Dataset Card Contact
For more information and contact: luisangel.motta@alumnos.viu.es or luchomotta97@gmail.com |
trl-lib/ultrachat_200k_chatml | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1224177384
num_examples: 207865
- name: test
num_bytes: 135316994
num_examples: 23110
download_size: 676202243
dataset_size: 1359494378
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
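The card exposes a single `text` column; presumably each row is a conversation flattened into ChatML markup, as the dataset name suggests. A hypothetical formatter for that layout (the helper name and message schema are illustrative, not taken from the card):

```python
def to_chatml(messages):
    """Flatten a list of {"role", "content"} dicts into ChatML text.

    Illustrative sketch only; the exact template used to build this
    dataset is not documented in the card.
    """
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>"
        for m in messages
    ]
    return "\n".join(parts)

example = to_chatml([
    {"role": "user", "content": "Hello!"},
    {"role": "assistant", "content": "Hi, how can I help?"},
])
print(example)
```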
|
open-llm-leaderboard/details_arlineka__Brunhilde-13b | ---
pretty_name: Evaluation run of arlineka/Brunhilde-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [arlineka/Brunhilde-13b](https://huggingface.co/arlineka/Brunhilde-13b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_arlineka__Brunhilde-13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-14T11:54:25.541681](https://huggingface.co/datasets/open-llm-leaderboard/details_arlineka__Brunhilde-13b/blob/main/results_2024-02-14T11-54-25.541681.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5588896471810725,\n\
\ \"acc_stderr\": 0.03358524149192356,\n \"acc_norm\": 0.5671325395608912,\n\
\ \"acc_norm_stderr\": 0.034363791698055104,\n \"mc1\": 0.3537331701346389,\n\
\ \"mc1_stderr\": 0.016737814358846147,\n \"mc2\": 0.5235005454748325,\n\
\ \"mc2_stderr\": 0.01582550300012819\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5836177474402731,\n \"acc_stderr\": 0.014405618279436178,\n\
\ \"acc_norm\": 0.6049488054607508,\n \"acc_norm_stderr\": 0.014285898292938169\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6406094403505278,\n\
\ \"acc_stderr\": 0.004788412062375697,\n \"acc_norm\": 0.8348934475204143,\n\
\ \"acc_norm_stderr\": 0.0037051790292873302\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5328947368421053,\n \"acc_stderr\": 0.040601270352363966,\n\
\ \"acc_norm\": 0.5328947368421053,\n \"acc_norm_stderr\": 0.040601270352363966\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5886792452830188,\n \"acc_stderr\": 0.030285009259009787,\n\
\ \"acc_norm\": 0.5886792452830188,\n \"acc_norm_stderr\": 0.030285009259009787\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.04076663253918567\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952344,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952344\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808777,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808777\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4595744680851064,\n \"acc_stderr\": 0.032579014820998356,\n\
\ \"acc_norm\": 0.4595744680851064,\n \"acc_norm_stderr\": 0.032579014820998356\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3148148148148148,\n \"acc_stderr\": 0.023919984164047736,\n \"\
acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.023919984164047736\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6483870967741936,\n\
\ \"acc_stderr\": 0.02716253782694846,\n \"acc_norm\": 0.6483870967741936,\n\
\ \"acc_norm_stderr\": 0.02716253782694846\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.43842364532019706,\n \"acc_stderr\": 0.03491207857486518,\n\
\ \"acc_norm\": 0.43842364532019706,\n \"acc_norm_stderr\": 0.03491207857486518\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0368105086916155,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0368105086916155\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.702020202020202,\n \"acc_stderr\": 0.03258630383836556,\n \"acc_norm\"\
: 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836556\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.02840895362624526,\n\
\ \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.02840895362624526\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5102564102564102,\n \"acc_stderr\": 0.025345672221942374,\n\
\ \"acc_norm\": 0.5102564102564102,\n \"acc_norm_stderr\": 0.025345672221942374\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945277,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945277\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.032145368597886394,\n\
\ \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.032145368597886394\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7376146788990826,\n \"acc_stderr\": 0.01886188502153473,\n \"\
acc_norm\": 0.7376146788990826,\n \"acc_norm_stderr\": 0.01886188502153473\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.37037037037037035,\n \"acc_stderr\": 0.03293377139415191,\n \"\
acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.03293377139415191\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591362,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591362\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n \
\ \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.03114679648297246,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.03114679648297246\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n\
\ \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.039849796533028725,\n \"\
acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.039849796533028725\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.036803503712864616,\n\
\ \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.036803503712864616\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.0465614711001235,\n\
\ \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.0465614711001235\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.026246772946890488,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.026246772946890488\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7522349936143039,\n\
\ \"acc_stderr\": 0.015438083080568973,\n \"acc_norm\": 0.7522349936143039,\n\
\ \"acc_norm_stderr\": 0.015438083080568973\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.025906632631016127,\n\
\ \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.025906632631016127\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.47262569832402235,\n\
\ \"acc_stderr\": 0.016697420650642752,\n \"acc_norm\": 0.47262569832402235,\n\
\ \"acc_norm_stderr\": 0.016697420650642752\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6209150326797386,\n \"acc_stderr\": 0.02778014120702335,\n\
\ \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.02778014120702335\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6302250803858521,\n\
\ \"acc_stderr\": 0.02741799670563099,\n \"acc_norm\": 0.6302250803858521,\n\
\ \"acc_norm_stderr\": 0.02741799670563099\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6419753086419753,\n \"acc_stderr\": 0.0266756119260371,\n\
\ \"acc_norm\": 0.6419753086419753,\n \"acc_norm_stderr\": 0.0266756119260371\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.42907801418439717,\n \"acc_stderr\": 0.02952591430255855,\n \
\ \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.02952591430255855\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42242503259452413,\n\
\ \"acc_stderr\": 0.012615600475734921,\n \"acc_norm\": 0.42242503259452413,\n\
\ \"acc_norm_stderr\": 0.012615600475734921\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5477941176470589,\n \"acc_stderr\": 0.030233758551596445,\n\
\ \"acc_norm\": 0.5477941176470589,\n \"acc_norm_stderr\": 0.030233758551596445\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5669934640522876,\n \"acc_stderr\": 0.020045442473324224,\n \
\ \"acc_norm\": 0.5669934640522876,\n \"acc_norm_stderr\": 0.020045442473324224\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.04673752333670239,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.04673752333670239\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726496,\n\
\ \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726496\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n\
\ \"acc_stderr\": 0.03096590312357303,\n \"acc_norm\": 0.7412935323383084,\n\
\ \"acc_norm_stderr\": 0.03096590312357303\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.031885780176863984,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.031885780176863984\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3537331701346389,\n\
\ \"mc1_stderr\": 0.016737814358846147,\n \"mc2\": 0.5235005454748325,\n\
\ \"mc2_stderr\": 0.01582550300012819\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.755327545382794,\n \"acc_stderr\": 0.012082125654159738\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09173616376042457,\n \
\ \"acc_stderr\": 0.007950942148339338\n }\n}\n```"
repo_url: https://huggingface.co/arlineka/Brunhilde-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|arc:challenge|25_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|gsm8k|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hellaswag|10_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T11-54-25.541681.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-14T11-54-25.541681.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- '**/details_harness|winogrande|5_2024-02-14T11-54-25.541681.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-14T11-54-25.541681.parquet'
- config_name: results
data_files:
- split: 2024_02_14T11_54_25.541681
path:
- results_2024-02-14T11-54-25.541681.parquet
- split: latest
path:
- results_2024-02-14T11-54-25.541681.parquet
---
# Dataset Card for Evaluation run of arlineka/Brunhilde-13b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [arlineka/Brunhilde-13b](https://huggingface.co/arlineka/Brunhilde-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_arlineka__Brunhilde-13b",
	"harness_winogrande_5",
	split="latest")
```
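The details repositories follow a predictable naming pattern, visible in the repo id above: the evaluated model's `org/name` is joined with a double underscore and prefixed with `open-llm-leaderboard/details_`. As a sketch, a small helper (hypothetical, not part of the `datasets` API) can build the repo id for any model:

```python
def details_repo(model_id: str) -> str:
    """Map a Hub model id ("org/model") to its leaderboard details repo.

    Follows the pattern used by this dataset's own repo id; assumed to
    hold for other evaluated models as well.
    """
    org, name = model_id.split("/", 1)
    return f"open-llm-leaderboard/details_{org}__{name}"


print(details_repo("arlineka/Brunhilde-13b"))
# open-llm-leaderboard/details_arlineka__Brunhilde-13b
```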
## Latest results
These are the [latest results from run 2024-02-14T11:54:25.541681](https://huggingface.co/datasets/open-llm-leaderboard/details_arlineka__Brunhilde-13b/blob/main/results_2024-02-14T11-54-25.541681.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5588896471810725,
"acc_stderr": 0.03358524149192356,
"acc_norm": 0.5671325395608912,
"acc_norm_stderr": 0.034363791698055104,
"mc1": 0.3537331701346389,
"mc1_stderr": 0.016737814358846147,
"mc2": 0.5235005454748325,
"mc2_stderr": 0.01582550300012819
},
"harness|arc:challenge|25": {
"acc": 0.5836177474402731,
"acc_stderr": 0.014405618279436178,
"acc_norm": 0.6049488054607508,
"acc_norm_stderr": 0.014285898292938169
},
"harness|hellaswag|10": {
"acc": 0.6406094403505278,
"acc_stderr": 0.004788412062375697,
"acc_norm": 0.8348934475204143,
"acc_norm_stderr": 0.0037051790292873302
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5328947368421053,
"acc_stderr": 0.040601270352363966,
"acc_norm": 0.5328947368421053,
"acc_norm_stderr": 0.040601270352363966
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5886792452830188,
"acc_stderr": 0.030285009259009787,
"acc_norm": 0.5886792452830188,
"acc_norm_stderr": 0.030285009259009787
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04076663253918567,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04076663253918567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952344,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952344
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808777,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808777
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4595744680851064,
"acc_stderr": 0.032579014820998356,
"acc_norm": 0.4595744680851064,
"acc_norm_stderr": 0.032579014820998356
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.023919984164047736,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.023919984164047736
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6483870967741936,
"acc_stderr": 0.02716253782694846,
"acc_norm": 0.6483870967741936,
"acc_norm_stderr": 0.02716253782694846
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43842364532019706,
"acc_stderr": 0.03491207857486518,
"acc_norm": 0.43842364532019706,
"acc_norm_stderr": 0.03491207857486518
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.0368105086916155,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.0368105086916155
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.03258630383836556,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.03258630383836556
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.02840895362624526,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.02840895362624526
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5102564102564102,
"acc_stderr": 0.025345672221942374,
"acc_norm": 0.5102564102564102,
"acc_norm_stderr": 0.025345672221942374
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945277,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945277
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.032145368597886394,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.032145368597886394
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7376146788990826,
"acc_stderr": 0.01886188502153473,
"acc_norm": 0.7376146788990826,
"acc_norm_stderr": 0.01886188502153473
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.03293377139415191,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.03293377139415191
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591362,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.03114679648297246,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.03114679648297246
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.039849796533028725,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.039849796533028725
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.043300437496507416,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.043300437496507416
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.036803503712864616,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.036803503712864616
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.0465614711001235,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.0465614711001235
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.026246772946890488,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.026246772946890488
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7522349936143039,
"acc_stderr": 0.015438083080568973,
"acc_norm": 0.7522349936143039,
"acc_norm_stderr": 0.015438083080568973
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.025906632631016127,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.025906632631016127
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.47262569832402235,
"acc_stderr": 0.016697420650642752,
"acc_norm": 0.47262569832402235,
"acc_norm_stderr": 0.016697420650642752
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6209150326797386,
"acc_stderr": 0.02778014120702335,
"acc_norm": 0.6209150326797386,
"acc_norm_stderr": 0.02778014120702335
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6302250803858521,
"acc_stderr": 0.02741799670563099,
"acc_norm": 0.6302250803858521,
"acc_norm_stderr": 0.02741799670563099
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6419753086419753,
"acc_stderr": 0.0266756119260371,
"acc_norm": 0.6419753086419753,
"acc_norm_stderr": 0.0266756119260371
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.42907801418439717,
"acc_stderr": 0.02952591430255855,
"acc_norm": 0.42907801418439717,
"acc_norm_stderr": 0.02952591430255855
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42242503259452413,
"acc_stderr": 0.012615600475734921,
"acc_norm": 0.42242503259452413,
"acc_norm_stderr": 0.012615600475734921
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5477941176470589,
"acc_stderr": 0.030233758551596445,
"acc_norm": 0.5477941176470589,
"acc_norm_stderr": 0.030233758551596445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5669934640522876,
"acc_stderr": 0.020045442473324224,
"acc_norm": 0.5669934640522876,
"acc_norm_stderr": 0.020045442473324224
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670239,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670239
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6530612244897959,
"acc_stderr": 0.030472526026726496,
"acc_norm": 0.6530612244897959,
"acc_norm_stderr": 0.030472526026726496
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7412935323383084,
"acc_stderr": 0.03096590312357303,
"acc_norm": 0.7412935323383084,
"acc_norm_stderr": 0.03096590312357303
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.031885780176863984,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.031885780176863984
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3537331701346389,
"mc1_stderr": 0.016737814358846147,
"mc2": 0.5235005454748325,
"mc2_stderr": 0.01582550300012819
},
"harness|winogrande|5": {
"acc": 0.755327545382794,
"acc_stderr": 0.012082125654159738
},
"harness|gsm8k|5": {
"acc": 0.09173616376042457,
"acc_stderr": 0.007950942148339338
}
}
```
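Once loaded, these per-task metrics are easy to summarize programmatically. A minimal sketch, using a hand-copied subset of the scores above (the full dict would come from the "results" config or the JSON file linked above):

```python
# A few of the per-task scores shown above, keyed by harness task name.
# ARC and HellaSwag use acc_norm; Winogrande and GSM8K report acc only.
scores = {
    "harness|arc:challenge|25": 0.6049488054607508,
    "harness|hellaswag|10": 0.8348934475204143,
    "harness|winogrande|5": 0.755327545382794,
    "harness|gsm8k|5": 0.09173616376042457,
}

# Rank tasks from strongest to weakest score.
ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
for task, score in ranked:
    print(f"{task}: {score:.4f}")
```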
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
zhijian12345/cat_classifiter | ---
license: openrail
---
|
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_54 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1322438228.0
num_examples: 259709
download_size: 1351828700
dataset_size: 1322438228.0
---
# Dataset Card for "chunk_54"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
epicdev/DreamDiffusion | ---
license: mit
---
|
korexyz/unsplash-people-v2 | ---
dataset_info:
features:
- name: url
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1514875.0
num_examples: 5970
download_size: 417162
dataset_size: 1514875.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
malucoelhaofc/TolkienV2 | ---
license: openrail
---
|
joey234/mmlu-anatomy-rule-neg | ---
dataset_info:
features:
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: question
dtype: string
splits:
- name: test
num_bytes: 33692
num_examples: 135
download_size: 19850
dataset_size: 33692
---
# Dataset Card for "mmlu-anatomy-rule-neg"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
one-sec-cv12/chunk_208 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 22526137440.75
num_examples: 234530
download_size: 20948738535
dataset_size: 22526137440.75
---
# Dataset Card for "chunk_208"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pharaouk/cortex_quantum | ---
dataset_info:
features:
- name: prompts
dtype: string
- name: responses
dtype: string
splits:
- name: train
num_bytes: 44328708
num_examples: 13630
download_size: 22913135
dataset_size: 44328708
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_DreadPoor__ComplectMaid-7B-slerp | ---
pretty_name: Evaluation run of DreadPoor/ComplectMaid-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [DreadPoor/ComplectMaid-7B-slerp](https://huggingface.co/DreadPoor/ComplectMaid-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DreadPoor__ComplectMaid-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-13T01:45:12.890226](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__ComplectMaid-7B-slerp/blob/main/results_2024-03-13T01-45-12.890226.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6504630913057503,\n\
\ \"acc_stderr\": 0.03223222580128249,\n \"acc_norm\": 0.6509980154908607,\n\
\ \"acc_norm_stderr\": 0.0328925449766911,\n \"mc1\": 0.4920440636474908,\n\
\ \"mc1_stderr\": 0.017501285074551835,\n \"mc2\": 0.6587617344262339,\n\
\ \"mc2_stderr\": 0.015090718639320998\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6638225255972696,\n \"acc_stderr\": 0.013804855026205763,\n\
\ \"acc_norm\": 0.6996587030716723,\n \"acc_norm_stderr\": 0.013395909309957007\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6989643497311293,\n\
\ \"acc_stderr\": 0.00457770702503138,\n \"acc_norm\": 0.8734315873332006,\n\
\ \"acc_norm_stderr\": 0.003318093579702919\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n\
\ \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511657,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511657\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6846153846153846,\n \"acc_stderr\": 0.02355964698318994,\n \
\ \"acc_norm\": 0.6846153846153846,\n \"acc_norm_stderr\": 0.02355964698318994\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.02925290592725197,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.02925290592725197\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297794,\n \
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297794\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461766,\n \"\
acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461766\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.02732547096671632,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.02732547096671632\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.03076935200822914,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.03076935200822914\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.038968789850704164,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.038968789850704164\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281376,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281376\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40670391061452515,\n\
\ \"acc_stderr\": 0.01642881191589886,\n \"acc_norm\": 0.40670391061452515,\n\
\ \"acc_norm_stderr\": 0.01642881191589886\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n\
\ \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.02389187954195961,\n\
\ \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.02389187954195961\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4634941329856584,\n\
\ \"acc_stderr\": 0.012736153390214961,\n \"acc_norm\": 0.4634941329856584,\n\
\ \"acc_norm_stderr\": 0.012736153390214961\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146294,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146294\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507208,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507208\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.02553843336857833,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.02553843336857833\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4920440636474908,\n\
\ \"mc1_stderr\": 0.017501285074551835,\n \"mc2\": 0.6587617344262339,\n\
\ \"mc2_stderr\": 0.015090718639320998\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8208366219415943,\n \"acc_stderr\": 0.010777949156047989\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6588324488248674,\n \
\ \"acc_stderr\": 0.013059111935831504\n }\n}\n```"
repo_url: https://huggingface.co/DreadPoor/ComplectMaid-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|arc:challenge|25_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|gsm8k|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hellaswag|10_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T01-45-12.890226.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-13T01-45-12.890226.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- '**/details_harness|winogrande|5_2024-03-13T01-45-12.890226.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-13T01-45-12.890226.parquet'
- config_name: results
data_files:
- split: 2024_03_13T01_45_12.890226
path:
- results_2024-03-13T01-45-12.890226.parquet
- split: latest
path:
- results_2024-03-13T01-45-12.890226.parquet
---
# Dataset Card for Evaluation run of DreadPoor/ComplectMaid-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [DreadPoor/ComplectMaid-7B-slerp](https://huggingface.co/DreadPoor/ComplectMaid-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DreadPoor__ComplectMaid-7B-slerp",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-13T01:45:12.890226](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__ComplectMaid-7B-slerp/blob/main/results_2024-03-13T01-45-12.890226.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6504630913057503,
"acc_stderr": 0.03223222580128249,
"acc_norm": 0.6509980154908607,
"acc_norm_stderr": 0.0328925449766911,
"mc1": 0.4920440636474908,
"mc1_stderr": 0.017501285074551835,
"mc2": 0.6587617344262339,
"mc2_stderr": 0.015090718639320998
},
"harness|arc:challenge|25": {
"acc": 0.6638225255972696,
"acc_stderr": 0.013804855026205763,
"acc_norm": 0.6996587030716723,
"acc_norm_stderr": 0.013395909309957007
},
"harness|hellaswag|10": {
"acc": 0.6989643497311293,
"acc_stderr": 0.00457770702503138,
"acc_norm": 0.8734315873332006,
"acc_norm_stderr": 0.003318093579702919
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511657,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511657
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6846153846153846,
"acc_stderr": 0.02355964698318994,
"acc_norm": 0.6846153846153846,
"acc_norm_stderr": 0.02355964698318994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.02925290592725197,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.02925290592725197
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.03006676158297794,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.03006676158297794
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461766,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461766
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.02732547096671632,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.02732547096671632
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.03076935200822914,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.03076935200822914
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.038968789850704164,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.038968789850704164
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281376,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281376
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40670391061452515,
"acc_stderr": 0.01642881191589886,
"acc_norm": 0.40670391061452515,
"acc_norm_stderr": 0.01642881191589886
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.02600480036395213,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.02600480036395213
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.02389187954195961,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.02389187954195961
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4634941329856584,
"acc_stderr": 0.012736153390214961,
"acc_norm": 0.4634941329856584,
"acc_norm_stderr": 0.012736153390214961
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146294,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146294
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507208,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507208
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.02553843336857833,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.02553843336857833
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4920440636474908,
"mc1_stderr": 0.017501285074551835,
"mc2": 0.6587617344262339,
"mc2_stderr": 0.015090718639320998
},
"harness|winogrande|5": {
"acc": 0.8208366219415943,
"acc_stderr": 0.010777949156047989
},
"harness|gsm8k|5": {
"acc": 0.6588324488248674,
"acc_stderr": 0.013059111935831504
}
}
```
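Once downloaded, the per-task entries above can be post-processed locally with plain Python. As a small illustrative sketch (using a hand-copied subset of the JSON above, with the same key and field names), here is one way to rank the MMLU (`hendrycksTest`) subtasks by accuracy:

```python
# Subset of the "Latest results" JSON above (illustrative; load the full
# results file from the repo for a complete picture).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.35},
    "harness|hendrycksTest-high_school_physics|5": {"acc": 0.33112582781456956},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8421052631578947},
}

# Keep only the MMLU (hendrycksTest) subtasks, keyed by a short task name.
mmlu = {
    key.split("-", 1)[1].split("|")[0]: fields["acc"]
    for key, fields in results.items()
    if "hendrycksTest" in key
}

best = max(mmlu, key=mmlu.get)
worst = min(mmlu, key=mmlu.get)
print(best, worst)  # world_religions high_school_physics
```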
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ccdv/mediasum | ---
language:
- en
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
task_categories:
- summarization
- text2text-generation
task_ids: []
tags:
- conditional-text-generation
---
# MediaSum dataset for summarization
Summarization dataset copied from [MediaSum: A Large-scale Media Interview Dataset for Dialogue Summarization](https://github.com/zcgzcgzcg1/MediaSum)
This dataset is compatible with the [`run_summarization.py`](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization) script from Transformers if you add this line to the `summarization_name_mapping` variable:
```python
"ccdv/mediasum": ("document", "summary")
```
# Configs
4 possible configs:
- `roberta` will concatenate documents with "\</s\>"
- `newline` will concatenate documents with "\n"
- `bert` will concatenate documents with "[SEP]"
- `list` will return the list of documents instead of a single string
Add `_prepended` to the config name to prepend the speaker name before each dialogue turn: `speaker: text` \
Default is `roberta_prepended` (compatible with BART).
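As a sketch of what each string config produces, they differ only in the separator used to join the dialogue turns (the utterances below are made up for illustration; the separators are assumed from the descriptions above):

```python
# Hypothetical dialogue turns, as the `list` config would return them
turns = ["Host: Welcome to the show.", "Guest: Thanks for having me."]

# Separator assumed for each string config
separators = {"roberta": "</s>", "newline": "\n", "bert": "[SEP]"}

# The `roberta` config joins turns with the RoBERTa/BART separator token
document = separators["roberta"].join(turns)
print(document)  # Host: Welcome to the show.</s>Guest: Thanks for having me.
```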
### Data Fields
- `id`: paper id
- `document`: a string/list containing the body of a set of documents
- `summary`: a string containing the abstract of the set
### Data Splits
This dataset has 3 splits: _train_, _validation_, and _test_. \
| Dataset Split | Number of Instances |
| ------------- | --------------------|
| Train | 443596 |
| Validation | 10000 |
| Test | 10000 |
# Cite original article
```
@article{zhu2021mediasum,
title={MediaSum: A Large-scale Media Interview Dataset for Dialogue Summarization},
author={Zhu, Chenguang and Liu, Yang and Mei, Jie and Zeng, Michael},
journal={arXiv preprint arXiv:2103.06410},
year={2021}
}
``` |
kartikmosaicml/omi_500k_simple_processed | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 331274795.75451076
num_examples: 500000
download_size: 145762142
dataset_size: 331274795.75451076
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/2c4acff4 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 182
num_examples: 10
download_size: 1339
dataset_size: 182
---
# Dataset Card for "2c4acff4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yotam56/hugo_suits_ds | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Subfolder_1
'1': Subfolder_10
'2': Subfolder_11
'3': Subfolder_12
'4': Subfolder_13
'5': Subfolder_14
'6': Subfolder_15
'7': Subfolder_16
'8': Subfolder_17
'9': Subfolder_18
'10': Subfolder_2
'11': Subfolder_3
'12': Subfolder_4
'13': Subfolder_5
'14': Subfolder_6
'15': Subfolder_7
'16': Subfolder_8
'17': Subfolder_9
splits:
- name: train
num_bytes: 862857.0
num_examples: 91
download_size: 859535
dataset_size: 862857.0
---
# Dataset Card for "hugo_suits_ds"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pytc/public | ---
license: mit
---
|
AddictiveFuture/sd15-useful-embeddings | ---
task_categories:
- text-to-image
language:
- en
tags:
- art
---
# Header test |
priyank-m/trdg_dict_random_words_en_text_recognition | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: string
splits:
- name: train
num_bytes: 3435853878.0
num_examples: 115000
download_size: 3436541480
dataset_size: 3435853878.0
---
# Dataset Card for "trdg_random_words_en_text_recognition"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/squad_title_train_10_eval_10 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 238057
num_examples: 150
- name: validation
num_bytes: 60056
num_examples: 48
download_size: 72691
dataset_size: 298113
---
# Dataset Card for "squad_title_train_10_eval_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
abdiharyadi/id_panl_bppt_with_amrbart_amr | ---
dataset_info:
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- en
- id
- name: topic
dtype:
class_label:
names:
'0': Economy
'1': International
'2': Science
'3': Sport
- name: amr
dtype: string
splits:
- name: train
num_bytes: 365469
num_examples: 1220
download_size: 170150
dataset_size: 365469
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "id_panl_bppt_with_amrbart_amr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kaleinaNyan/test | ---
configs:
- config_name: accepted
data_dir: accepted
- config_name: rejected
data_dir: rejected
--- |
unography/stock-images-bg-removed-10k-v2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: mask
dtype: image
splits:
- name: train
num_bytes: 4890286182.26
num_examples: 10799
- name: test
num_bytes: 2584905.0
num_examples: 20
download_size: 4871183022
dataset_size: 4892871087.26
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
facat/sft-train-samples | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: output
dtype: string
- name: task
dtype: string
- name: name
dtype: string
splits:
- name: train
num_bytes: 7997635
num_examples: 2420
download_size: 4481416
dataset_size: 7997635
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "sft-train-samples"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ranimeree/OriginalTrainSynthTest | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 495211607.205
num_examples: 2769
- name: validation
num_bytes: 61731350.0
num_examples: 352
- name: test
num_bytes: 39333114.0
num_examples: 360
download_size: 588275572
dataset_size: 596276071.2049999
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/ead6d7ef | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 182
num_examples: 10
download_size: 1341
dataset_size: 182
---
# Dataset Card for "ead6d7ef"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KyS/SpeakerEmbedding0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: Speakers
dtype: string
- name: Audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sampling_rate
dtype: int64
splits:
- name: train
num_bytes: 29234994
num_examples: 46
download_size: 7249854
dataset_size: 29234994
---
# Dataset Card for "SpeakerEmbedding0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
adamjweintraut/bart-finetuned-lyrlen-128-tokens_2024-03-22_run | ---
dataset_info:
features:
- name: id
dtype: int64
- name: syllable_counts
dtype: string
- name: predicted
dtype: string
- name: label
dtype: string
- name: rougeL_min_precision
dtype: float64
- name: rougeL_min_recall
dtype: float64
- name: rougeL_min_fmeasure
dtype: float64
- name: rougeL_median_precision
dtype: float64
- name: rougeL_median_recall
dtype: float64
- name: rougeL_median_fmeasure
dtype: float64
- name: rougeL_max_precision
dtype: float64
- name: rougeL_max_recall
dtype: float64
- name: rougeL_max_fmeasure
dtype: float64
- name: predicted_label_sim
dtype: float32
splits:
- name: train
num_bytes: 681709
num_examples: 300
download_size: 304493
dataset_size: 681709
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
WillHeld/librispeech_parquet | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: text
dtype: string
- name: speaker_id
dtype: int64
- name: chapter_id
dtype: int64
- name: id
dtype: string
splits:
- name: test
num_bytes: 367966786.42
num_examples: 2620
- name: validation
num_bytes: 359841018.966
num_examples: 2703
- name: train.100
num_bytes: 6622513525.062
num_examples: 28539
- name: train.360
num_bytes: 23908576855.828
num_examples: 104014
- name: train.500
num_bytes: 31825046131.584
num_examples: 148688
- name: train.960
num_bytes: 62356128107.863
num_examples: 281241
download_size: 121680142766
dataset_size: 125440072425.72299
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: validation
path: data/validation-*
- split: train.100
path: data/train.100-*
- split: train.360
path: data/train.360-*
- split: train.500
path: data/train.500-*
- split: train.960
path: data/train.960-*
---
|
ovior/twitter_dataset_1712996654 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2280427
num_examples: 6939
download_size: 1292006
dataset_size: 2280427
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AiForTheChurch/catholic_denomination_300 | ---
dataset_info:
features:
- name: user
dtype: string
- name: llm
dtype: string
splits:
- name: train
num_bytes: 172156
num_examples: 300
download_size: 91806
dataset_size: 172156
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "catholic_denomination_300"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pphuc25/khanhdinhpham | ---
dataset_info:
features:
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1137699
num_examples: 58
download_size: 521927
dataset_size: 1137699
---
# Dataset Card for "khanhdinhpham"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yeonggeunjang/carrotPerMiles | ---
license: apache-2.0
---
|
coastalcph/fm_updates | ---
dataset_info:
features:
- name: query
struct:
- name: label
dtype: string
- name: objects
list:
- name: label
dtype: string
- name: qid
dtype: string
- name: qid
dtype: string
- name: rel_id
dtype: string
- name: relation
dtype: string
- name: prediction
struct:
- name: predictions
list:
- name: answer
dtype: string
- name: first_token_probability
dtype: float64
- name: per_token_probability
sequence: float64
- name: perplexity
dtype: float64
- name: query
dtype: string
- name: relation
dtype: string
- name: type
dtype: string
- name: updates
sequence: string
splits:
- name: train
num_bytes: 1525467
num_examples: 5080
download_size: 606338
dataset_size: 1525467
---
# Dataset Card for "fm_updates"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ShoukanLabs/OpenNiji-275001_310000 | ---
dataset_info:
features:
- name: image
dtype: image
- name: url
dtype: string
- name: prompt
dtype: string
- name: style
dtype: string
splits:
- name: train
num_bytes: 51817025617.879
num_examples: 34999
download_size: 56134489081
dataset_size: 51817025617.879
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "OpenNiji-275001_310000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/p38_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of p38/P38/P38 (Girls' Frontline)
This is the dataset of p38/P38/P38 (Girls' Frontline), containing 11 images and their tags.
The core tags of this character are `brown_hair, hat, garrison_cap, military_hat, long_hair, purple_eyes, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 11 | 6.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p38_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 11 | 5.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p38_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 21 | 10.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p38_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 11 | 6.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p38_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 21 | 12.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p38_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/p38_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, military_uniform, solo, belt, white_background, handgun, iron_cross, jacket, open_mouth, black_skirt, boots, holding_gun, holster, looking_at_viewer, simple_background, thighhighs, collared_shirt, pleated_skirt, pouch, walther |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | military_uniform | solo | belt | white_background | handgun | iron_cross | jacket | open_mouth | black_skirt | boots | holding_gun | holster | looking_at_viewer | simple_background | thighhighs | collared_shirt | pleated_skirt | pouch | walther |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------------|:-------|:-------|:-------------------|:----------|:-------------|:---------|:-------------|:--------------|:--------|:--------------|:----------|:--------------------|:--------------------|:-------------|:-----------------|:----------------|:--------|:----------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
TKNodven/Mordred | ---
language:
- ja
--- |
tiennv/vietnamese-news | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 28837412151
num_examples: 12573213
download_size: 15141327938
dataset_size: 28837412151
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "vietnamese-news"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-one-sec-cv12-each-chunk-uniq/chunk_260 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 919207916.0
num_examples: 179113
download_size: 939322608
dataset_size: 919207916.0
---
# Dataset Card for "chunk_260"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SLPL/naab-raw | ---
language:
- fa
license:
- mit
multilinguality:
- monolingual
task_categories:
- fill-mask
- text-generation
task_ids:
- language-modeling
- masked-language-modeling
pretty_name: naab-raw (raw version of the naab corpus)
---
# naab-raw (raw version of the naab corpus)
_[If you want to join our community to keep up with news, models and datasets from naab, click on [this](https://docs.google.com/forms/d/e/1FAIpQLSe8kevFl_ODCx-zapAuOIAQYr8IvkVVaVHOuhRL9Ha0RVJ6kg/viewform) link.]_
## Table of Contents
- [Dataset Card Creation Guide](#dataset-card-creation-guide)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Changelog](#changelog)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Contribution Guideline](#contribution-guideline)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Sharif Speech and Language Processing Lab](https://huggingface.co/SLPL)
- **Paper:** [naab: A ready-to-use plug-and-play corpus for Farsi](https://arxiv.org/abs/2208.13486)
- **Point of Contact:** [Sadra Sabouri](mailto:sabouri.sadra@gmail.com)
### Dataset Summary
This is the raw (uncleaned) version of the [naab](https://huggingface.co/datasets/SLPL/naab) corpus. You can also customize our [preprocess script](https://github.com/Sharif-SLPL/t5-fa/tree/main/preprocess) to make your own cleaned corpus. This repository is a hub for all Farsi corpora. Feel free to add your corpus following the [contribution guidelines](#contribution-guideline).
You can download the dataset with the command below:
```python
from datasets import load_dataset
dataset = load_dataset("SLPL/naab-raw")
```
If you want to download a specific part of the corpus, you can set the config name to that corpus name:
```python
from datasets import load_dataset
dataset = load_dataset("SLPL/naab-raw", "CC-fa")
```
### Supported Tasks and Leaderboards
This corpus can be used to train any language model with a Masked Language Modeling (MLM) or other self-supervised objective.
- `language-modeling`
- `masked-language-modeling`
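As a minimal illustration of the masked-language-modeling objective mentioned above (the whitespace tokenization and 15% mask ratio are illustrative assumptions; real training uses a proper tokenizer):

```python
import random

def mask_tokens(tokens, mask_token="[MASK]", ratio=0.15, seed=0):
    """Randomly replace roughly `ratio` of the tokens with a mask token (MLM-style)."""
    rng = random.Random(seed)  # seeded for reproducibility
    return [mask_token if rng.random() < ratio else tok for tok in tokens]

# Illustrative sentence from the example row above, split on whitespace
tokens = "این یک تست برای نمایش یک پاراگراف است".split()
masked = mask_tokens(tokens)
```

A model trained with this objective learns to predict the original token at each masked position.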
### Changelog
It's crucial to keep a log of changes on projects that change periodically. Please refer to the [CHANGELOG.md](https://huggingface.co/datasets/SLPL/naab-raw/blob/main/CHANGELOG.md) for more details.
## Dataset Structure
Each row of the dataset looks something like this:
```json
{
'text': "این یک تست برای نمایش یک پاراگراف در پیکره متنی ناب است.",
}
```
+ `text` : the textual paragraph.
### Data Splits
This corpus contains a single split (the `train` split).
## Dataset Creation
### Curation Rationale
Here are some details about each part of this corpus.
#### CC-fa
The Common Crawl corpus contains petabytes of data collected since 2008, including raw web page data, extracted metadata, and text extractions. We use its Farsi part here.
#### W2C
W2C stands for Web to Corpus, and it contains several corpora. We include its Farsi part in this corpus.
### Contribution Guideline
To add your dataset, follow the steps below and make a pull request to be merged into _naab-raw_:
1. Add your dataset to `_CORPUS_URLS` in `naab-raw.py` like:
```python
...
"DATASET_NAME": "LINK_TO_A_PUBLIC_DOWNLOADABLE_FILE.txt"
...
```
2. Add a log of your changes to the [CHANGELOG.md](https://huggingface.co/datasets/SLPL/naab-raw/blob/main/CHANGELOG.md).
3. Add some minor descriptions to the [Curation Rationale](#curation-rationale) under a subsection with your dataset name.
### Personal and Sensitive Information
Since this corpus is essentially a compilation of existing corpora, we take no responsibility for personal information included in it. If you detect any such violation, please let us know and we will do our best to remove it from the corpus as soon as possible.
We tried our best to provide anonymity while keeping the crucial information. We shuffled some parts of the corpus so that information passed through possible conversations would not be harmful.
## Additional Information
### Dataset Curators
+ Sadra Sabouri (Sharif University of Technology)
+ Elnaz Rahmati (Sharif University of Technology)
### Licensing Information
MIT
### Citation Information
```
@article{sabouri2022naab,
title={naab: A ready-to-use plug-and-play corpus for Farsi},
author={Sabouri, Sadra and Rahmati, Elnaz and Gooran, Soroush and Sameti, Hossein},
journal={arXiv preprint arXiv:2208.13486},
year={2022}
}
```
DOI:[https://doi.org/10.48550/arXiv.2208.13486](https://doi.org/10.48550/arXiv.2208.13486).
### Contributions
Thanks to [@sadrasabouri](https://github.com/sadrasabouri) and [@elnazrahmati](https://github.com/elnazrahmati) for adding this dataset.
### Keywords
+ Farsi
+ Persian
+ raw text
+ پیکره فارسی
+ پیکره متنی
+ آموزش مدل زبانی
|
CyberHarem/rurutie_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of rurutie/ルルティエ/露露缇耶 (Azur Lane)
This is the dataset of rurutie/ルルティエ/露露缇耶 (Azur Lane), containing 72 images and their tags.
The core tags of this character are `long_hair, black_hair, bow, braid, hair_bow, red_eyes, animal_ears, twin_braids, tail, brown_hair, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 72 | 108.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rurutie_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 72 | 59.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rurutie_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 160 | 116.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rurutie_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 72 | 92.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rurutie_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 160 | 166.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rurutie_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/rurutie_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------|
| 0 | 35 |  |  |  |  |  | 1girl, solo, looking_at_viewer, japanese_clothes, wide_sleeves, long_sleeves, blush, bangs, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | japanese_clothes | wide_sleeves | long_sleeves | blush | bangs | smile |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:-------------------|:---------------|:---------------|:--------|:--------|:--------|
| 0 | 35 |  |  |  |  |  | X | X | X | X | X | X | X | X | X |
|
Thauab/voice785 | ---
license: openrail
---
|
angelika/CRAB | ---
task_categories:
- text-classification
- multiple-choice
language:
- en
size_categories:
- 1K<n<10K
---
# CRAB: Causal Reasoning Assessment Benchmark
## Dataset Details
## Dataset Creation
## Splits - Tasks
### Pairwise Causality Assessment
### Graded Causality Assessment
## Citation
To cite 🦀 CRAB, please use:
```
@inproceedings{romanou2023crab,
title={CRAB: Assessing the Strength of Causal Relationships Between Real-world Events},
author={Angelika Romanou and Syrielle Montariol and Debjit Paul and Leo Laugier and Karl Aberer and Antoine Bosselut},
year={2023},
eprint={2311.04284},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
Calad/fake-TGC | ---
license: apache-2.0
---
|
bigscience-data/roots_indic-ur_wikiquote | ---
language: ur
license: cc-by-sa-3.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_indic-ur_wikiquote
# wikiquote_filtered
- Dataset uid: `wikiquote_filtered`
### Description
### Homepage
### Licensing
### Speaker Locations
### Sizes
- 0.0462 % of total
- 0.1697 % of en
- 0.0326 % of fr
- 0.0216 % of ar
- 0.0066 % of zh
- 0.0833 % of pt
- 0.0357 % of es
- 0.0783 % of indic-ta
- 0.0361 % of indic-hi
- 0.0518 % of ca
- 0.0405 % of vi
- 0.0834 % of indic-ml
- 0.0542 % of indic-te
- 0.1172 % of indic-gu
- 0.0634 % of indic-kn
- 0.0539 % of id
- 0.0454 % of indic-ur
- 0.0337 % of indic-mr
- 0.0347 % of eu
### BigScience processing steps
#### Filters applied to: en
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_en
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: fr
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_fr
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: ar
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_ar
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: zh
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_zhs
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: pt
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_pt
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: es
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_es
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: indic-ta
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-ta
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-hi
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-hi
- dedup_template_soft
- filter_small_docs_bytes_300
#### Filters applied to: ca
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_ca
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: vi
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_vi
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-ml
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-ml
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-te
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-te
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-gu
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-gu
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-kn
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-kn
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: id
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_id
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-ur
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-mr
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-mr
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: eu
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_eu
- dedup_template_soft
- replace_newline_with_space
|
maidalun1020/CrosslingualRetrievalLawEn2Zh | ---
license: apache-2.0
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
splits:
- name: queries
num_bytes: 5832742
num_examples: 26642
- name: corpus
num_bytes: 4789805
num_examples: 4899
download_size: 6283345
dataset_size: 10622547
---
|
KerVerse/Amharic_Stories | ---
license: apache-2.0
---
|
IlyaGusev/ru_sharegpt_cleaned | ---
language:
- ru
size_categories:
- n<1K
task_categories:
- conversational
- text-generation
dataset_info:
features:
- name: messages
sequence:
- name: role
dtype: string
- name: content
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 1993005
num_examples: 273
download_size: 2054401
dataset_size: 1993005
---
|
legacy107/cpgQA | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: answer
dtype: string
- name: answer_start
dtype: int64
- name: question
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 1259359
num_examples: 987
- name: test
num_bytes: 143518
num_examples: 110
download_size: 232065
dataset_size: 1402877
---
# Dataset Card for "cpgQA"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dim/runne_prompts | ---
dataset_info:
features:
- name: text
dtype: string
- name: parsed_entities
dtype: string
splits:
- name: train
num_bytes: 2636744
num_examples: 537
download_size: 1142735
dataset_size: 2636744
---
# Dataset Card for "runne_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_BFauber__opt125m_10e4 | ---
pretty_name: Evaluation run of BFauber/opt125m_10e4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [BFauber/opt125m_10e4](https://huggingface.co/BFauber/opt125m_10e4) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of\
  \ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BFauber__opt125m_10e4\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
  These are the [latest results from run 2024-02-02T18:39:22.964015](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt125m_10e4/blob/main/results_2024-02-02T18-39-22.964015.json) (note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2653997971375558,\n\
\ \"acc_stderr\": 0.03091751185138889,\n \"acc_norm\": 0.2667186188669961,\n\
\ \"acc_norm_stderr\": 0.03173676795406758,\n \"mc1\": 0.22888616891064872,\n\
\ \"mc1_stderr\": 0.014706994909055027,\n \"mc2\": 0.4287951061339734,\n\
\ \"mc2_stderr\": 0.014935297274427089\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.20648464163822525,\n \"acc_stderr\": 0.011828865619002316,\n\
\ \"acc_norm\": 0.2295221843003413,\n \"acc_norm_stderr\": 0.012288926760890797\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2877912766381199,\n\
\ \"acc_stderr\": 0.004518080594528024,\n \"acc_norm\": 0.3090021907986457,\n\
\ \"acc_norm_stderr\": 0.004611377019520813\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n\
\ \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n\
\ \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2981132075471698,\n \"acc_stderr\": 0.028152837942493857,\n\
\ \"acc_norm\": 0.2981132075471698,\n \"acc_norm_stderr\": 0.028152837942493857\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.31213872832369943,\n\
\ \"acc_stderr\": 0.035331333893236574,\n \"acc_norm\": 0.31213872832369943,\n\
\ \"acc_norm_stderr\": 0.035331333893236574\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082633,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082633\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.18,\n\
\ \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2553191489361702,\n \"acc_stderr\": 0.0285048564705142,\n\
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.0285048564705142\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.042663394431593935,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.042663394431593935\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.03600105692727772,\n\
\ \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.03600105692727772\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24603174603174602,\n \"acc_stderr\": 0.022182037202948368,\n \"\
acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.022182037202948368\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.03893259610604675,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.03893259610604675\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816508,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816508\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n\
\ \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.3161290322580645,\n\
\ \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03010833071801162,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03010833071801162\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\"\
: 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3484848484848485,\n \"acc_stderr\": 0.033948539651564025,\n \"\
acc_norm\": 0.3484848484848485,\n \"acc_norm_stderr\": 0.033948539651564025\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.36787564766839376,\n \"acc_stderr\": 0.03480175668466036,\n\
\ \"acc_norm\": 0.36787564766839376,\n \"acc_norm_stderr\": 0.03480175668466036\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3641025641025641,\n \"acc_stderr\": 0.02439667298509477,\n \
\ \"acc_norm\": 0.3641025641025641,\n \"acc_norm_stderr\": 0.02439667298509477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3445378151260504,\n \"acc_stderr\": 0.030868682604121633,\n\
\ \"acc_norm\": 0.3445378151260504,\n \"acc_norm_stderr\": 0.030868682604121633\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658754,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658754\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3376146788990826,\n \"acc_stderr\": 0.020275265986638907,\n \"\
acc_norm\": 0.3376146788990826,\n \"acc_norm_stderr\": 0.020275265986638907\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2549019607843137,\n\
\ \"acc_stderr\": 0.030587591351604246,\n \"acc_norm\": 0.2549019607843137,\n\
\ \"acc_norm_stderr\": 0.030587591351604246\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.1940928270042194,\n \"acc_stderr\": 0.025744902532290916,\n\
\ \"acc_norm\": 0.1940928270042194,\n \"acc_norm_stderr\": 0.025744902532290916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.1210762331838565,\n\
\ \"acc_stderr\": 0.021894174113185737,\n \"acc_norm\": 0.1210762331838565,\n\
\ \"acc_norm_stderr\": 0.021894174113185737\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2900763358778626,\n \"acc_stderr\": 0.03980066246467765,\n\
\ \"acc_norm\": 0.2900763358778626,\n \"acc_norm_stderr\": 0.03980066246467765\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.14049586776859505,\n \"acc_stderr\": 0.03172233426002161,\n \"\
acc_norm\": 0.14049586776859505,\n \"acc_norm_stderr\": 0.03172233426002161\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.21296296296296297,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.033220157957767414,\n\
\ \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.033220157957767414\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.15178571428571427,\n\
\ \"acc_stderr\": 0.03405702838185694,\n \"acc_norm\": 0.15178571428571427,\n\
\ \"acc_norm_stderr\": 0.03405702838185694\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.34951456310679613,\n \"acc_stderr\": 0.04721188506097173,\n\
\ \"acc_norm\": 0.34951456310679613,\n \"acc_norm_stderr\": 0.04721188506097173\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19658119658119658,\n\
\ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.19658119658119658,\n\
\ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909281,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909281\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.20306513409961685,\n\
\ \"acc_stderr\": 0.014385525076611581,\n \"acc_norm\": 0.20306513409961685,\n\
\ \"acc_norm_stderr\": 0.014385525076611581\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2138728323699422,\n \"acc_stderr\": 0.022075709251757183,\n\
\ \"acc_norm\": 0.2138728323699422,\n \"acc_norm_stderr\": 0.022075709251757183\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n\
\ \"acc_stderr\": 0.014893391735249588,\n \"acc_norm\": 0.27262569832402234,\n\
\ \"acc_norm_stderr\": 0.014893391735249588\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.02609016250427905,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.02609016250427905\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24115755627009647,\n\
\ \"acc_stderr\": 0.024296594034763426,\n \"acc_norm\": 0.24115755627009647,\n\
\ \"acc_norm_stderr\": 0.024296594034763426\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.22530864197530864,\n \"acc_stderr\": 0.023246202647819743,\n\
\ \"acc_norm\": 0.22530864197530864,\n \"acc_norm_stderr\": 0.023246202647819743\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2624113475177305,\n \"acc_stderr\": 0.026244920349843007,\n \
\ \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.026244920349843007\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n\
\ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2173202614379085,\n \"acc_stderr\": 0.01668482092914859,\n \
\ \"acc_norm\": 0.2173202614379085,\n \"acc_norm_stderr\": 0.01668482092914859\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n\
\ \"acc_stderr\": 0.04013964554072774,\n \"acc_norm\": 0.22727272727272727,\n\
\ \"acc_norm_stderr\": 0.04013964554072774\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.031362502409358936,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.031362502409358936\n \
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.25870646766169153,\n\
\ \"acc_stderr\": 0.030965903123573026,\n \"acc_norm\": 0.25870646766169153,\n\
\ \"acc_norm_stderr\": 0.030965903123573026\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384739,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384739\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.20481927710843373,\n\
\ \"acc_stderr\": 0.03141784291663926,\n \"acc_norm\": 0.20481927710843373,\n\
\ \"acc_norm_stderr\": 0.03141784291663926\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.17543859649122806,\n \"acc_stderr\": 0.029170885500727654,\n\
\ \"acc_norm\": 0.17543859649122806,\n \"acc_norm_stderr\": 0.029170885500727654\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22888616891064872,\n\
\ \"mc1_stderr\": 0.014706994909055027,\n \"mc2\": 0.4287951061339734,\n\
\ \"mc2_stderr\": 0.014935297274427089\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.4972375690607735,\n \"acc_stderr\": 0.014052271211616448\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/BFauber/opt125m_10e4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|arc:challenge|25_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|gsm8k|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hellaswag|10_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T18-39-22.964015.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T18-39-22.964015.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- '**/details_harness|winogrande|5_2024-02-02T18-39-22.964015.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T18-39-22.964015.parquet'
- config_name: results
data_files:
- split: 2024_02_02T18_39_22.964015
path:
- results_2024-02-02T18-39-22.964015.parquet
- split: latest
path:
- results_2024-02-02T18-39-22.964015.parquet
---
# Dataset Card for Evaluation run of BFauber/opt125m_10e4
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BFauber/opt125m_10e4](https://huggingface.co/BFauber/opt125m_10e4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BFauber__opt125m_10e4",
"harness_winogrande_5",
	split="latest")
```
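As the configuration list above suggests, each per-run split name appears to be derived from the run timestamp by replacing `:` and `-` with `_` (a sketch of the assumed mapping, inferred from the split names in the YAML frontmatter):

```python
# Derive a per-run split name from a run timestamp, as the split names in
# this card's YAML suggest: ":" and "-" are replaced by "_".
# (The "latest" split is simply an alias for the newest run.)
run_timestamp = "2024-02-02T18:39:22.964015"
split_name = run_timestamp.replace(":", "_").replace("-", "_")
print(split_name)  # 2024_02_02T18_39_22.964015
```

This can be handy when you want to address a specific historical run rather than the `latest` alias.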
## Latest results
These are the [latest results from run 2024-02-02T18:39:22.964015](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt125m_10e4/blob/main/results_2024-02-02T18-39-22.964015.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.2653997971375558,
"acc_stderr": 0.03091751185138889,
"acc_norm": 0.2667186188669961,
"acc_norm_stderr": 0.03173676795406758,
"mc1": 0.22888616891064872,
"mc1_stderr": 0.014706994909055027,
"mc2": 0.4287951061339734,
"mc2_stderr": 0.014935297274427089
},
"harness|arc:challenge|25": {
"acc": 0.20648464163822525,
"acc_stderr": 0.011828865619002316,
"acc_norm": 0.2295221843003413,
"acc_norm_stderr": 0.012288926760890797
},
"harness|hellaswag|10": {
"acc": 0.2877912766381199,
"acc_stderr": 0.004518080594528024,
"acc_norm": 0.3090021907986457,
"acc_norm_stderr": 0.004611377019520813
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.03673731683969506,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.03673731683969506
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2981132075471698,
"acc_stderr": 0.028152837942493857,
"acc_norm": 0.2981132075471698,
"acc_norm_stderr": 0.028152837942493857
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.31213872832369943,
"acc_stderr": 0.035331333893236574,
"acc_norm": 0.31213872832369943,
"acc_norm_stderr": 0.035331333893236574
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082633,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082633
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.0285048564705142,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.0285048564705142
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.042663394431593935,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.042663394431593935
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.03600105692727772,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.03600105692727772
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.022182037202948368,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.022182037202948368
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604675,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604675
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03010833071801162,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03010833071801162
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3484848484848485,
"acc_stderr": 0.033948539651564025,
"acc_norm": 0.3484848484848485,
"acc_norm_stderr": 0.033948539651564025
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.36787564766839376,
"acc_stderr": 0.03480175668466036,
"acc_norm": 0.36787564766839376,
"acc_norm_stderr": 0.03480175668466036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3641025641025641,
"acc_stderr": 0.02439667298509477,
"acc_norm": 0.3641025641025641,
"acc_norm_stderr": 0.02439667298509477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3445378151260504,
"acc_stderr": 0.030868682604121633,
"acc_norm": 0.3445378151260504,
"acc_norm_stderr": 0.030868682604121633
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658754,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658754
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3376146788990826,
"acc_stderr": 0.020275265986638907,
"acc_norm": 0.3376146788990826,
"acc_norm_stderr": 0.020275265986638907
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.1940928270042194,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.1940928270042194,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.1210762331838565,
"acc_stderr": 0.021894174113185737,
"acc_norm": 0.1210762331838565,
"acc_norm_stderr": 0.021894174113185737
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2900763358778626,
"acc_stderr": 0.03980066246467765,
"acc_norm": 0.2900763358778626,
"acc_norm_stderr": 0.03980066246467765
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.14049586776859505,
"acc_stderr": 0.03172233426002161,
"acc_norm": 0.14049586776859505,
"acc_norm_stderr": 0.03172233426002161
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.033220157957767414,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.033220157957767414
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.15178571428571427,
"acc_stderr": 0.03405702838185694,
"acc_norm": 0.15178571428571427,
"acc_norm_stderr": 0.03405702838185694
},
"harness|hendrycksTest-management|5": {
"acc": 0.34951456310679613,
"acc_stderr": 0.04721188506097173,
"acc_norm": 0.34951456310679613,
"acc_norm_stderr": 0.04721188506097173
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19658119658119658,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.19658119658119658,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.20306513409961685,
"acc_stderr": 0.014385525076611581,
"acc_norm": 0.20306513409961685,
"acc_norm_stderr": 0.014385525076611581
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.022075709251757183,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.022075709251757183
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249588,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.02609016250427905,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.02609016250427905
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24115755627009647,
"acc_stderr": 0.024296594034763426,
"acc_norm": 0.24115755627009647,
"acc_norm_stderr": 0.024296594034763426
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22530864197530864,
"acc_stderr": 0.023246202647819743,
"acc_norm": 0.22530864197530864,
"acc_norm_stderr": 0.023246202647819743
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2624113475177305,
"acc_stderr": 0.026244920349843007,
"acc_norm": 0.2624113475177305,
"acc_norm_stderr": 0.026244920349843007
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2173202614379085,
"acc_stderr": 0.01668482092914859,
"acc_norm": 0.2173202614379085,
"acc_norm_stderr": 0.01668482092914859
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072774,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072774
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4,
"acc_stderr": 0.031362502409358936,
"acc_norm": 0.4,
"acc_norm_stderr": 0.031362502409358936
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.25870646766169153,
"acc_stderr": 0.030965903123573026,
"acc_norm": 0.25870646766169153,
"acc_norm_stderr": 0.030965903123573026
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-virology|5": {
"acc": 0.20481927710843373,
"acc_stderr": 0.03141784291663926,
"acc_norm": 0.20481927710843373,
"acc_norm_stderr": 0.03141784291663926
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.17543859649122806,
"acc_stderr": 0.029170885500727654,
"acc_norm": 0.17543859649122806,
"acc_norm_stderr": 0.029170885500727654
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22888616891064872,
"mc1_stderr": 0.014706994909055027,
"mc2": 0.4287951061339734,
"mc2_stderr": 0.014935297274427089
},
"harness|winogrande|5": {
"acc": 0.4972375690607735,
"acc_stderr": 0.014052271211616448
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Yijia-Xiao/alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: label
dtype: string
splits:
- name: eval
num_bytes: 640299
num_examples: 805
download_size: 329403
dataset_size: 640299
---
# Dataset Card for "alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/fwv2_squad_rare_tip_train_100_eval_100 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 81184
num_examples: 300
- name: train_doc2id
num_bytes: 36110
num_examples: 200
- name: train_id2doc
num_bytes: 36710
num_examples: 200
- name: train_find_word
num_bytes: 44474
num_examples: 100
- name: eval_find_word
num_bytes: 27815
num_examples: 100
- name: id_context_mapping
num_bytes: 30310
num_examples: 200
download_size: 165444
dataset_size: 256603
---
# Dataset Card for "fwv2_squad_rare_tip_train_100_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
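As a quick sanity check on the `dataset_info` block above: in these auto-generated cards, `dataset_size` is the sum of the per-split `num_bytes` (the uncompressed size; `download_size` is the compressed Parquet size). The sketch below just re-verifies that relationship using the numbers from this card:

```python
# Per-split num_bytes copied from the dataset_info block above.
split_num_bytes = {
    "train": 81184,
    "train_doc2id": 36110,
    "train_id2doc": 36710,
    "train_find_word": 44474,
    "eval_find_word": 27815,
    "id_context_mapping": 30310,
}

# dataset_size as reported in the card metadata.
reported_dataset_size = 256603

total = sum(split_num_bytes.values())
print(total)  # 256603
assert total == reported_dataset_size
```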
hmao/rule_learning_data_v0_test | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: prompt
dtype: string
- name: task_name
dtype: string
- name: filepath
dtype: string
- name: rule
dtype: string
- name: description
dtype: string
- name: configuration
dtype: string
splits:
- name: train
num_bytes: 203602
num_examples: 100
download_size: 97940
dataset_size: 203602
---
# Dataset Card for "rule_learning_data_v0_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
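The `configs` block above maps each split to repository files via a glob pattern (`data/train-*`). A minimal sketch of that matching, using `fnmatch` and an assumed shard filename (Hugging Face typically writes shards as `<split>-<shard>-of-<total>.parquet`; the exact resolver on the Hub may differ from `fnmatch` semantics):

```python
from fnmatch import fnmatch

# Glob pattern taken from the `configs` block above.
pattern = "data/train-*"

# Hypothetical repository listing; the shard name is an assumed example.
repo_files = [
    "README.md",
    "data/train-00000-of-00001.parquet",
]

train_files = [f for f in repo_files if fnmatch(f, pattern)]
print(train_files)  # ['data/train-00000-of-00001.parquet']
```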
aalexchengg/jp_ner_ws | ---
dataset_info:
features:
- name: id
dtype: string
- name: tokens
dtype: string
- name: ner_tags
sequence: int64
splits:
- name: train
num_bytes: 3042209
num_examples: 5339
download_size: 770166
dataset_size: 3042209
---
# Dataset Card for "jp_ner_ws"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/bc081991 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 184
num_examples: 10
download_size: 1340
dataset_size: 184
---
# Dataset Card for "bc081991"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dkshjn/chatdoctor-200k-stripped-embedded-v2 | ---
dataset_info:
features:
- name: input
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: embeddings_input
sequence: float64
- name: embeddings_output
sequence: float64
splits:
- name: s1
num_bytes: 70815
num_examples: 10
download_size: 73079
dataset_size: 70815
configs:
- config_name: default
data_files:
- split: s1
path: data/s1-*
---
|
axel-rda/salary_extraction_ft_dataset | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1250380
num_examples: 216
- name: test
num_bytes: 231719
num_examples: 39
download_size: 531443
dataset_size: 1482099
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
pouya-haghi/imagenet-1k | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: int64
splits:
- name: train
num_bytes: 43931424.0
num_examples: 1024
download_size: 43911764
dataset_size: 43931424.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Ahmed107/q_Sample | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 1112508.0
num_examples: 8
download_size: 1114606
dataset_size: 1112508.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
flamesbob/Dark_fantasy | ---
license: creativeml-openrail-m
---
|
Bingpot/bundestag | ---
license: cc0-1.0
---
|
sujra/mini-insurance | ---
dataset_info:
features:
- name: 'instruction '
dtype: string
- name: output
dtype: string
- name: text
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 70922
num_examples: 96
download_size: 36386
dataset_size: 70922
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
adriont/ivanlima | ---
license: openrail
---
|
open-llm-leaderboard/details_smelborp__MixtralOrochi8x7B-Alt | ---
pretty_name: Evaluation run of smelborp/MixtralOrochi8x7B-Alt
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [smelborp/MixtralOrochi8x7B-Alt](https://huggingface.co/smelborp/MixtralOrochi8x7B-Alt)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_smelborp__MixtralOrochi8x7B-Alt\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-29T16:36:26.301610](https://huggingface.co/datasets/open-llm-leaderboard/details_smelborp__MixtralOrochi8x7B-Alt/blob/main/results_2023-12-29T16-36-26.301610.json) (note
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6897102628842678,\n\
\ \"acc_stderr\": 0.03039342739788087,\n \"acc_norm\": 0.7029575662168503,\n\
\ \"acc_norm_stderr\": 0.03120937495626396,\n \"mc1\": 0.45532435740514077,\n\
\ \"mc1_stderr\": 0.017433490102538772,\n \"mc2\": 0.6403236854599645,\n\
\ \"mc2_stderr\": 0.01510362269809065\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6621160409556314,\n \"acc_stderr\": 0.013822047922283505,\n\
\ \"acc_norm\": 0.6791808873720137,\n \"acc_norm_stderr\": 0.013640943091946535\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6770563632742481,\n\
\ \"acc_stderr\": 0.004666457279979415,\n \"acc_norm\": 0.86247759410476,\n\
\ \"acc_norm_stderr\": 0.00343694164178278\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.04094376269996793,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.04094376269996793\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7828947368421053,\n \"acc_stderr\": 0.03355045304882924,\n\
\ \"acc_norm\": 0.7828947368421053,\n \"acc_norm_stderr\": 0.03355045304882924\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n\
\ \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7886792452830189,\n \"acc_stderr\": 0.025125766484827845,\n\
\ \"acc_norm\": 0.7886792452830189,\n \"acc_norm_stderr\": 0.025125766484827845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.031164899666948614,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.031164899666948614\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n\
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7109826589595376,\n\
\ \"acc_stderr\": 0.03456425745086998,\n \"acc_norm\": 0.7109826589595376,\n\
\ \"acc_norm_stderr\": 0.03456425745086998\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.04940635630605659,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.04940635630605659\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n\
\ \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6978723404255319,\n \"acc_stderr\": 0.030017554471880557,\n\
\ \"acc_norm\": 0.6978723404255319,\n \"acc_norm_stderr\": 0.030017554471880557\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5614035087719298,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.5614035087719298,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419035,\n\
\ \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419035\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5026455026455027,\n \"acc_stderr\": 0.02575094967813038,\n \"\
acc_norm\": 0.5026455026455027,\n \"acc_norm_stderr\": 0.02575094967813038\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8129032258064516,\n \"acc_stderr\": 0.022185710092252255,\n \"\
acc_norm\": 0.8129032258064516,\n \"acc_norm_stderr\": 0.022185710092252255\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5911330049261084,\n \"acc_stderr\": 0.034590588158832314,\n \"\
acc_norm\": 0.5911330049261084,\n \"acc_norm_stderr\": 0.034590588158832314\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\"\
: 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8363636363636363,\n \"acc_stderr\": 0.02888787239548795,\n\
\ \"acc_norm\": 0.8363636363636363,\n \"acc_norm_stderr\": 0.02888787239548795\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822516,\n \"\
acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822516\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.017426974154240524,\n\
\ \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 0.017426974154240524\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \
\ \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131137,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131137\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7773109243697479,\n \"acc_stderr\": 0.027025433498882385,\n\
\ \"acc_norm\": 0.7773109243697479,\n \"acc_norm_stderr\": 0.027025433498882385\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4105960264900662,\n \"acc_stderr\": 0.04016689594849928,\n \"\
acc_norm\": 0.4105960264900662,\n \"acc_norm_stderr\": 0.04016689594849928\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8733944954128441,\n \"acc_stderr\": 0.014257128686165169,\n \"\
acc_norm\": 0.8733944954128441,\n \"acc_norm_stderr\": 0.014257128686165169\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5648148148148148,\n \"acc_stderr\": 0.03381200005643526,\n \"\
acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.03381200005643526\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8627450980392157,\n \"acc_stderr\": 0.024152225962801588,\n \"\
acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.024152225962801588\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8776371308016878,\n \"acc_stderr\": 0.02133174182974679,\n \
\ \"acc_norm\": 0.8776371308016878,\n \"acc_norm_stderr\": 0.02133174182974679\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7668161434977578,\n\
\ \"acc_stderr\": 0.02838039114709471,\n \"acc_norm\": 0.7668161434977578,\n\
\ \"acc_norm_stderr\": 0.02838039114709471\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596914,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596914\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.859504132231405,\n \"acc_stderr\": 0.031722334260021585,\n \"\
acc_norm\": 0.859504132231405,\n \"acc_norm_stderr\": 0.031722334260021585\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8425925925925926,\n\
\ \"acc_stderr\": 0.03520703990517963,\n \"acc_norm\": 0.8425925925925926,\n\
\ \"acc_norm_stderr\": 0.03520703990517963\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.03157065078911901,\n\
\ \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.03157065078911901\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5535714285714286,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.5535714285714286,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.034926064766237906,\n\
\ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.034926064766237906\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9145299145299145,\n\
\ \"acc_stderr\": 0.018315891685625852,\n \"acc_norm\": 0.9145299145299145,\n\
\ \"acc_norm_stderr\": 0.018315891685625852\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8812260536398467,\n\
\ \"acc_stderr\": 0.011569134791715655,\n \"acc_norm\": 0.8812260536398467,\n\
\ \"acc_norm_stderr\": 0.011569134791715655\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.02353292543104429,\n\
\ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.02353292543104429\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4670391061452514,\n\
\ \"acc_stderr\": 0.016686126653013934,\n \"acc_norm\": 0.4670391061452514,\n\
\ \"acc_norm_stderr\": 0.016686126653013934\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7875816993464052,\n \"acc_stderr\": 0.023420375478296132,\n\
\ \"acc_norm\": 0.7875816993464052,\n \"acc_norm_stderr\": 0.023420375478296132\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.77491961414791,\n\
\ \"acc_stderr\": 0.023720088516179027,\n \"acc_norm\": 0.77491961414791,\n\
\ \"acc_norm_stderr\": 0.023720088516179027\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8209876543209876,\n \"acc_stderr\": 0.02133086876212706,\n\
\ \"acc_norm\": 0.8209876543209876,\n \"acc_norm_stderr\": 0.02133086876212706\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5141843971631206,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.5141843971631206,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5234680573663625,\n\
\ \"acc_stderr\": 0.012756161942523346,\n \"acc_norm\": 0.5234680573663625,\n\
\ \"acc_norm_stderr\": 0.012756161942523346\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7573529411764706,\n \"acc_stderr\": 0.02604066247420125,\n\
\ \"acc_norm\": 0.7573529411764706,\n \"acc_norm_stderr\": 0.02604066247420125\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7320261437908496,\n \"acc_stderr\": 0.017917974069594722,\n \
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.017917974069594722\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7673469387755102,\n \"acc_stderr\": 0.02704925791589618,\n\
\ \"acc_norm\": 0.7673469387755102,\n \"acc_norm_stderr\": 0.02704925791589618\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n\
\ \"acc_stderr\": 0.023729830881018533,\n \"acc_norm\": 0.8706467661691543,\n\
\ \"acc_norm_stderr\": 0.023729830881018533\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015577,\n\
\ \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015577\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.45532435740514077,\n\
\ \"mc1_stderr\": 0.017433490102538772,\n \"mc2\": 0.6403236854599645,\n\
\ \"mc2_stderr\": 0.01510362269809065\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8003157063930545,\n \"acc_stderr\": 0.01123532838262585\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/smelborp/MixtralOrochi8x7B-Alt
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|arc:challenge|25_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|gsm8k|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hellaswag|10_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T16-36-26.301610.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T16-36-26.301610.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- '**/details_harness|winogrande|5_2023-12-29T16-36-26.301610.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-29T16-36-26.301610.parquet'
- config_name: results
data_files:
- split: 2023_12_29T16_36_26.301610
path:
- results_2023-12-29T16-36-26.301610.parquet
- split: latest
path:
- results_2023-12-29T16-36-26.301610.parquet
---
# Dataset Card for Evaluation run of smelborp/MixtralOrochi8x7B-Alt
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [smelborp/MixtralOrochi8x7B-Alt](https://huggingface.co/smelborp/MixtralOrochi8x7B-Alt) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_smelborp__MixtralOrochi8x7B-Alt",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-12-29T16:36:26.301610](https://huggingface.co/datasets/open-llm-leaderboard/details_smelborp__MixtralOrochi8x7B-Alt/blob/main/results_2023-12-29T16-36-26.301610.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6897102628842678,
"acc_stderr": 0.03039342739788087,
"acc_norm": 0.7029575662168503,
"acc_norm_stderr": 0.03120937495626396,
"mc1": 0.45532435740514077,
"mc1_stderr": 0.017433490102538772,
"mc2": 0.6403236854599645,
"mc2_stderr": 0.01510362269809065
},
"harness|arc:challenge|25": {
"acc": 0.6621160409556314,
"acc_stderr": 0.013822047922283505,
"acc_norm": 0.6791808873720137,
"acc_norm_stderr": 0.013640943091946535
},
"harness|hellaswag|10": {
"acc": 0.6770563632742481,
"acc_stderr": 0.004666457279979415,
"acc_norm": 0.86247759410476,
"acc_norm_stderr": 0.00343694164178278
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.04094376269996793,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.04094376269996793
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7828947368421053,
"acc_stderr": 0.03355045304882924,
"acc_norm": 0.7828947368421053,
"acc_norm_stderr": 0.03355045304882924
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7886792452830189,
"acc_stderr": 0.025125766484827845,
"acc_norm": 0.7886792452830189,
"acc_norm_stderr": 0.025125766484827845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.031164899666948614,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.031164899666948614
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.03456425745086998,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.03456425745086998
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.04940635630605659,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.04940635630605659
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6978723404255319,
"acc_stderr": 0.030017554471880557,
"acc_norm": 0.6978723404255319,
"acc_norm_stderr": 0.030017554471880557
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5614035087719298,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.5614035087719298,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5026455026455027,
"acc_stderr": 0.02575094967813038,
"acc_norm": 0.5026455026455027,
"acc_norm_stderr": 0.02575094967813038
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8129032258064516,
"acc_stderr": 0.022185710092252255,
"acc_norm": 0.8129032258064516,
"acc_norm_stderr": 0.022185710092252255
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5911330049261084,
"acc_stderr": 0.034590588158832314,
"acc_norm": 0.5911330049261084,
"acc_norm_stderr": 0.034590588158832314
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8363636363636363,
"acc_stderr": 0.02888787239548795,
"acc_norm": 0.8363636363636363,
"acc_norm_stderr": 0.02888787239548795
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822516,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822516
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9378238341968912,
"acc_stderr": 0.017426974154240524,
"acc_norm": 0.9378238341968912,
"acc_norm_stderr": 0.017426974154240524
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.02371088850197057,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.02371088850197057
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131137,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131137
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7773109243697479,
"acc_stderr": 0.027025433498882385,
"acc_norm": 0.7773109243697479,
"acc_norm_stderr": 0.027025433498882385
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4105960264900662,
"acc_stderr": 0.04016689594849928,
"acc_norm": 0.4105960264900662,
"acc_norm_stderr": 0.04016689594849928
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8733944954128441,
"acc_stderr": 0.014257128686165169,
"acc_norm": 0.8733944954128441,
"acc_norm_stderr": 0.014257128686165169
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.03381200005643526,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.03381200005643526
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8627450980392157,
"acc_stderr": 0.024152225962801588,
"acc_norm": 0.8627450980392157,
"acc_norm_stderr": 0.024152225962801588
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8776371308016878,
"acc_stderr": 0.02133174182974679,
"acc_norm": 0.8776371308016878,
"acc_norm_stderr": 0.02133174182974679
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7668161434977578,
"acc_stderr": 0.02838039114709471,
"acc_norm": 0.7668161434977578,
"acc_norm_stderr": 0.02838039114709471
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596914,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596914
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.859504132231405,
"acc_stderr": 0.031722334260021585,
"acc_norm": 0.859504132231405,
"acc_norm_stderr": 0.031722334260021585
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.03520703990517963,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.03520703990517963
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.03157065078911901,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.03157065078911901
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5535714285714286,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.5535714285714286,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.034926064766237906,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.034926064766237906
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9145299145299145,
"acc_stderr": 0.018315891685625852,
"acc_norm": 0.9145299145299145,
"acc_norm_stderr": 0.018315891685625852
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8812260536398467,
"acc_stderr": 0.011569134791715655,
"acc_norm": 0.8812260536398467,
"acc_norm_stderr": 0.011569134791715655
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.02353292543104429,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.02353292543104429
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4670391061452514,
"acc_stderr": 0.016686126653013934,
"acc_norm": 0.4670391061452514,
"acc_norm_stderr": 0.016686126653013934
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7875816993464052,
"acc_stderr": 0.023420375478296132,
"acc_norm": 0.7875816993464052,
"acc_norm_stderr": 0.023420375478296132
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.77491961414791,
"acc_stderr": 0.023720088516179027,
"acc_norm": 0.77491961414791,
"acc_norm_stderr": 0.023720088516179027
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8209876543209876,
"acc_stderr": 0.02133086876212706,
"acc_norm": 0.8209876543209876,
"acc_norm_stderr": 0.02133086876212706
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5141843971631206,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.5141843971631206,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5234680573663625,
"acc_stderr": 0.012756161942523346,
"acc_norm": 0.5234680573663625,
"acc_norm_stderr": 0.012756161942523346
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7573529411764706,
"acc_stderr": 0.02604066247420125,
"acc_norm": 0.7573529411764706,
"acc_norm_stderr": 0.02604066247420125
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.017917974069594722,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.017917974069594722
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7673469387755102,
"acc_stderr": 0.02704925791589618,
"acc_norm": 0.7673469387755102,
"acc_norm_stderr": 0.02704925791589618
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8706467661691543,
"acc_stderr": 0.023729830881018533,
"acc_norm": 0.8706467661691543,
"acc_norm_stderr": 0.023729830881018533
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015577,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015577
},
"harness|truthfulqa:mc|0": {
"mc1": 0.45532435740514077,
"mc1_stderr": 0.017433490102538772,
"mc2": 0.6403236854599645,
"mc2_stderr": 0.01510362269809065
},
"harness|winogrande|5": {
"acc": 0.8003157063930545,
"acc_stderr": 0.01123532838262585
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
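As a sketch of how a results dictionary shaped like the JSON above can be post-processed, the snippet below averages per-task scores with plain Python. The dictionary here is a small hand-copied subset of the run's metrics, included only for illustration; it is not the full results file.

```python
# Minimal sketch: averaging per-task scores from a results dictionary
# shaped like the JSON above. Values are a hand-copied subset of this
# run's metrics, used purely as example data.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.6791808873720137},
    "harness|hellaswag|10": {"acc_norm": 0.86247759410476},
    "harness|winogrande|5": {"acc": 0.8003157063930545},
}

# Prefer the length-normalized accuracy (acc_norm) when present,
# falling back to plain accuracy otherwise.
scores = [m.get("acc_norm", m.get("acc")) for m in results.values()]
average = sum(scores) / len(scores)
print(f"average score over {len(scores)} tasks: {average:.4f}")
```

The same pattern applies to the full `results` config once it is loaded: iterate over the task keys, pick the metric of interest, and aggregate.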
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
jahb57/gpt2_embeddings_BATCH_14 | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: last_hidden_state
sequence:
sequence: float32
splits:
- name: train
num_bytes: 18580842836
num_examples: 100000
download_size: 18629530835
dataset_size: 18580842836
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
SINAI/NECOS | ---
license: cc-by-nc-sa-4.0
language:
- es
pretty_name: NECOS
---
### Title:
NECOS: An annotated corpus to identify constructive news comments in Spanish
### Dataset Description
**Paper**: [NECOS: An annotated corpus to identify constructive news comments in Spanish](http://journal.sepln.org/sepln/ojs/ojs/index.php/pln/article/download/6321/3750)
**Point of Contact**: flor.plaza@unibocconi.it, maite@ujaen.es
NEws and COmments in Spanish (NECOS) is a collection of Spanish comments posted in response to newspaper articles. Following a robust annotation scheme, three annotators labeled each comment as constructive or non-constructive. The articles were published in the newspaper El Mundo between April 3rd and April 30th, 2018.
The corpus is composed of a total of 10 news articles and 1,419 comments. Three annotators manually labeled NECOS, reaching an average Cohen's kappa of 78.97.
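The average Cohen's kappa reported above is a pairwise agreement measure averaged over the three annotator pairs. A minimal pure-Python sketch (using toy labels, not the actual NECOS annotations) of how such a figure can be computed:

```python
from itertools import combinations

def cohen_kappa(a, b):
    """Cohen's kappa between two annotators' label sequences."""
    assert len(a) == len(b) and len(a) > 0
    n = len(a)
    labels = set(a) | set(b)
    # Observed agreement: fraction of items both annotators labeled identically
    po = sum(x == y for x, y in zip(a, b)) / n
    # Expected chance agreement, from each annotator's marginal label frequencies
    pe = sum((a.count(l) / n) * (b.count(l) / n) for l in labels)
    return (po - pe) / (1 - pe)

# Toy comments labeled by three annotators (1 = constructive, 0 = non-constructive)
ann1 = [1, 0, 1, 1, 0, 1, 0, 0]
ann2 = [1, 0, 1, 0, 0, 1, 0, 0]
ann3 = [1, 0, 1, 1, 0, 1, 1, 0]

pairwise = [cohen_kappa(x, y) for x, y in combinations([ann1, ann2, ann3], 2)]
avg_kappa = sum(pairwise) / len(pairwise)
print(f"average pairwise kappa: {avg_kappa:.2f}")  # prints: average pairwise kappa: 0.68
```

(`scikit-learn` offers an equivalent `cohen_kappa_score`; the hand-rolled version here just keeps the sketch dependency-free.)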
### Licensing Information
NECOS is released under the [Apache-2.0 License](http://www.apache.org/licenses/LICENSE-2.0).
### Citation Information
```bibtex
@article{lopez2021necos,
title={NECOS: An annotated corpus to identify constructive news comments in Spanish},
author={L{\'o}pez-{\'U}beda, Pilar and Plaza-del-Arco, Flor Miriam and D{\'\i}az-Galiano, Manuel Carlos and Mart{\'\i}n-Valdivia, M Teresa},
journal={Procesamiento del Lenguaje Natural},
volume={66},
pages={41--51},
year={2021}
}
``` |
legacy107/newsqa-chunked-50 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence: string
- name: key
dtype: string
- name: labels
list:
- name: end
sequence: int64
- name: start
sequence: int64
- name: document_id
dtype: int64
- name: chunks
sequence: string
splits:
- name: train
num_bytes: 608073207
num_examples: 69960
- name: validation
num_bytes: 37377549
num_examples: 4200
- name: test
num_bytes: 36416017
num_examples: 4212
download_size: 59816869
dataset_size: 681866773
---
# Dataset Card for "newsqa-chunked-50"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BigTMiami/amazon_21M_reviews_part_2_of_6_643K | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 719303832
num_examples: 107874
- name: validation
num_bytes: 55391076
num_examples: 8307
download_size: 246901393
dataset_size: 774694908
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
mwz/urdu-speech | ---
license: apache-2.0
task_categories:
- automatic-speech-recognition
dataset_info:
features:
- name: CHANNEL_NAME
dtype: 'null'
- name: URL
dtype: 'null'
- name: TITLE
dtype: 'null'
- name: DESCRIPTION
dtype: 'null'
- name: TRANSCRIPTION
dtype: 'null'
- name: SEGMENTS
dtype: 'null'
- name: __index_level_0__
dtype: 'null'
splits:
- name: train
download_size: 1797
dataset_size: 0
tags:
- whisper
- whispering
- large
---
|
Deepjyoti120/AssameseDataTrain | ---
license: artistic-2.0
---
|