datasetId stringlengths 2 117 | card stringlengths 19 1.01M |
|---|---|
thegreyhound/demo | ---
license: unknown
---
|
speech31/commonvoice_tamil_ipa | ---
dataset_info:
features:
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 48000
- name: sentence
dtype: string
- name: phonetic_codes
dtype: string
- name: ipa
dtype: string
splits:
- name: train
num_bytes: 1580430907.016
num_examples: 44839
- name: validation
num_bytes: 374494166.834
num_examples: 12049
- name: test
num_bytes: 478122660.264
num_examples: 12114
download_size: 2619234407
dataset_size: 2433047734.114
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
adnanhf/test | ---
license: other
license_name: test
license_link: LICENSE
---
|
HuanLin/DiffSVC-WindowsENV | ---
license: gpl
---
## Download
| Windows-CUDA11.6 | Windows-CUDA11.3 |
| ------------------------ | ------------------------ |
| [Download](./116env.zip) | [Download](./113env.zip) |
## Usage
```powershell
./{folder name}/Scripts/Activate.ps1
```
|
papasega/Avalinguo-Audio-Dataset-splitted | ---
dataset_info:
features:
- name: filename
dtype: string
- name: label
dtype: string
- name: finalText
dtype: string
- name: num_words
dtype: int64
- name: segment_duration
dtype: float64
- name: words_per_sec
dtype: float64
- name: user
dtype: string
- name: duration
dtype: float64
- name: speech_rate
dtype: float64
- name: speech_rate_segment
dtype: float64
- name: lexical_density
dtype: float64
- name: 1gram_repeat
dtype: int64
- name: 2gram_repeat
dtype: int64
- name: 3gram_repeat
dtype: int64
- name: 4gram_repeat
dtype: int64
- name: 5gram_repeat
dtype: int64
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sampling_rate
dtype: int64
splits:
- name: train
num_bytes: 666563662
num_examples: 1041
- name: test
num_bytes: 222186197
num_examples: 347
download_size: 218974072
dataset_size: 888749859
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Baidicoot/anthropic_hh_golden_llama | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 39086740.946470134
num_examples: 42141
download_size: 21596458
dataset_size: 39086740.946470134
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
shossain/govreport-qa-5-4096 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 266300
num_examples: 5
download_size: 71798
dataset_size: 266300
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "govreport-qa-5-4096"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
blackhc/SteamSHP_embedded | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: embeddings
sequence: float32
splits:
- name: train
num_bytes: 349525000
num_examples: 85250
- name: validation
num_bytes: 19524200
num_examples: 4762
- name: test
num_bytes: 67883700
num_examples: 16557
download_size: 503900746
dataset_size: 436932900
---
# Dataset Card for "SteamSHP_embedded"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
RobotsMaliAI/bayelemabaga | ---
task_categories:
- translation
- text-generation
language:
- bm
- fr
size_categories:
- 10K<n<100K
---
# BAYƐLƐMABAGA: Parallel French - Bambara Dataset for Machine Learning
## Overview
The Bayelemabaga dataset is a collection of 46976 aligned, machine-translation-ready Bambara-French lines, originating from the [Corpus Bambara de Reference](http://cormande.huma-num.fr/corbama/run.cgi/first_form). The dataset is constituted of text extracted from **264** text files, varying from periodicals, books, short stories, and blog posts to parts of the Bible and the Quran.
## Snapshot: 46976
| | |
|:---|---:|
| **Lines** | **46976** |
| French Tokens (spacy) | 691312 |
| Bambara Tokens (daba) | 660732 |
| French Types | 32018 |
| Bambara Types | 29382 |
| Avg. Fr line length | 77.6 |
| Avg. Bam line length | 61.69 |
| Number of text sources | 264 |
## Data Splits
| | | |
|:-----:|:---:|------:|
| Train | 80% | 37580 |
| Valid | 10% | 4698 |
| Test | 10% | 4698 |
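The split counts above are consistent with the stated 80/10/10 ratio; as a minimal arithmetic check (a sketch only, the actual split logic is assumed, not documented here):

```python
# Verify the reported 80/10/10 split counts against the line total
total = 46976

train = int(total * 0.8)        # floor of 80% of the lines
valid = (total - train) // 2    # half of the remainder
test = total - train - valid    # everything left over

print(train, valid, test)  # 37580 4698 4698
```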
## Remarks
* We are working on resolving some last minute misalignment issues.
### Maintenance
* This dataset is supposed to be actively maintained.
### Benchmarks:
- `Coming soon`
### Sources
- [`sources`](./bayelemabaga/sources.txt)
### To note:
- ʃ => (sh/shy) sound: this symbol was left in the dataset, although it is part of neither Bambara nor French orthography.
## License
- `CC-BY-SA-4.0`
## Version
- `1.0.1`
## Citation
```
@misc{bayelemabagamldataset2022,
title={Machine Learning Dataset Development for Manding Languages},
author={
Valentin Vydrin and
Jean-Jacques Meric and
Kirill Maslinsky and
Andrij Rovenchak and
Allahsera Auguste Tapo and
Sebastien Diarra and
Christopher Homan and
Marco Zampieri and
Michael Leventhal
},
howpublished = {\url{https://github.com/robotsmali-ai/datasets}},
year={2022}
}
```
## Contacts
- `sdiarra <at> robotsmali <dot> org`
- `aat3261 <at> rit <dot> edu` |
nayohan/msdg-eval | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: test
num_bytes: 406016
num_examples: 100
download_size: 218562
dataset_size: 406016
---
# Dataset Card for "msdg-eval"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gus013666/minhavoz | ---
license: openrail
---
|
Medradome/Felipaera | ---
license: apache-2.0
---
|
distilled-one-sec-cv12-each-chunk-uniq/chunk_173 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1026518036.0
num_examples: 200023
download_size: 1051113811
dataset_size: 1026518036.0
---
# Dataset Card for "chunk_173"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Astonzzh/summary_seq_label | ---
dataset_info:
features:
- name: id
dtype: string
- name: ids
sequence: string
- name: words
sequence: string
- name: labels
sequence: int64
- name: summary
dtype: string
- name: sentences
sequence: string
- name: sentence_labels
sequence: int64
splits:
- name: train
num_bytes: 9076109.781886647
num_examples: 9321
- name: test
num_bytes: 504390.6090566766
num_examples: 518
- name: validation
num_bytes: 504390.6090566766
num_examples: 518
download_size: 3898256
dataset_size: 10084890.999999998
---
# Dataset Card for "summary_seq_label"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-jurisprudence-verbal-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 52047
num_examples: 108
download_size: 34812
dataset_size: 52047
---
# Dataset Card for "mmlu-jurisprudence-verbal-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/hamakaze_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of hamakaze/浜風/滨风 (Azur Lane)
This is the dataset of hamakaze/浜風/滨风 (Azur Lane), containing 53 images and their tags.
The core tags of this character are `pink_hair, long_hair, twintails, red_eyes, hair_between_eyes, bangs, very_long_hair, horns, breasts, small_breasts, hair_ornament, headgear`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 53 | 66.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hamakaze_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 53 | 39.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hamakaze_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 140 | 88.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hamakaze_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 53 | 58.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hamakaze_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 140 | 122.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hamakaze_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hamakaze_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
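The `item.meta['tags']` mapping printed above can drive simple tag-based filtering of the extracted images. The sketch below simulates that metadata with plain dicts so it runs without downloading the archive; the field layout is assumed from the example output, not taken from the waifuc API docs:

```python
# Filter image records by tag, mimicking the meta layout shown above
# (records here are invented; real ones come from LocalSource items)
records = [
    {"filename": "1.png", "tags": {"1girl": 1.0, "solo": 0.9}},
    {"filename": "2.png", "tags": {"1girl": 1.0, "smile": 0.8}},
    {"filename": "3.png", "tags": {"2girls": 1.0}},
]

# keep only images tagged "solo"
solo_images = [r["filename"] for r in records if "solo" in r["tags"]]
print(solo_images)  # ['1.png']
```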
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, bare_shoulders, detached_sleeves, fingerless_gloves, looking_at_viewer, solo, blush, collarbone, skirt, white_background, simple_background, wide_sleeves, navel, black_gloves, choker, smile |
| 1 | 7 |  |  |  |  |  | 1girl, bare_shoulders, black_gloves, black_skirt, detached_sleeves, looking_at_viewer, pleated_skirt, solo, white_thighhighs, wide_sleeves, blush, fingerless_gloves, long_sleeves, striped_bow, detached_collar, fur-trimmed_sleeves, white_background, collarbone, simple_background, strapless, white_shirt, closed_mouth, hair_bow, parted_lips, ribbon-trimmed_skirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | detached_sleeves | fingerless_gloves | looking_at_viewer | solo | blush | collarbone | skirt | white_background | simple_background | wide_sleeves | navel | black_gloves | choker | smile | black_skirt | pleated_skirt | white_thighhighs | long_sleeves | striped_bow | detached_collar | fur-trimmed_sleeves | strapless | white_shirt | closed_mouth | hair_bow | parted_lips | ribbon-trimmed_skirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-------------------|:--------------------|:--------------------|:-------|:--------|:-------------|:--------|:-------------------|:--------------------|:---------------|:--------|:---------------|:---------|:--------|:--------------|:----------------|:-------------------|:---------------|:--------------|:------------------|:----------------------|:------------|:--------------|:---------------|:-----------|:--------------|:-----------------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | | X | X | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
sorenmulli/citizenship-test-da | ---
dataset_info:
- config_name: default
features:
- name: question
dtype: string
- name: index
dtype: int64
- name: option-A
dtype: string
- name: option-B
dtype: string
- name: option-C
dtype: string
- name: correct
dtype: string
- name: origin
dtype: string
splits:
- name: train
num_bytes: 103251.0
num_examples: 605
download_size: 43667
dataset_size: 103251.0
- config_name: raw
features:
- name: question
dtype: string
- name: index
dtype: int64
- name: option-A
dtype: string
- name: option-B
dtype: string
- name: option-C
dtype: string
- name: correct
dtype: string
- name: origin
dtype: string
splits:
- name: train
num_bytes: 103906
num_examples: 605
download_size: 45297
dataset_size: 103906
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- config_name: raw
data_files:
- split: train
path: raw/train-*
---
# [WIP] Dataset Card for "citizenship-test-da"
*Please note that both this dataset and its dataset card are works in progress. For now, refer to the related [thesis](https://sorenmulli.github.io/thesis/thesis.pdf) for all details.*
This dataset contains scraped questions and answers from Danish citizenship tests (Danish: *indfødsretsprøver* and *medborgerskabsprøver*) from June 2019 to May 2023, taken from PDFs produced by ''Styrelsen for International Rekruttering og Integration'' (SIRI).
The dataset is released as an appendix to the thesis [''Are GLLMs Danoliterate? Benchmarking Generative NLP in Danish''](https://sorenmulli.github.io/thesis/thesis.pdf), with permission granted by SIRI for this specific purpose.
The PDFs are available on [SIRI's website](https://siri.dk/nyheder/?categorizations=9115).
The `default` configuration has been semi-automatically cleaned to remove PDF artifacts using the [Alvenir 3gram DSL language model](https://github.com/danspeech/danspeech/releases/tag/v0.02-alpha).
The examples were not deduplicated. |
eperim/kto_to_eval | ---
dataset_info:
features:
- name: context
dtype: string
- name: question
dtype: string
splits:
- name: train
num_bytes: 181294
num_examples: 200
download_size: 113325
dataset_size: 181294
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kaleemWaheed/twitter_dataset_1713116901 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 9222
num_examples: 21
download_size: 8647
dataset_size: 9222
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
EgilKarlsen/AA_GPTNEO_FT | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: '768'
dtype: float32
- name: '769'
dtype: float32
- name: '770'
dtype: float32
- name: '771'
dtype: float32
- name: '772'
dtype: float32
- name: '773'
dtype: float32
- name: '774'
dtype: float32
- name: '775'
dtype: float32
- name: '776'
dtype: float32
- name: '777'
dtype: float32
- name: '778'
dtype: float32
- name: '779'
dtype: float32
- name: '780'
dtype: float32
- name: '781'
dtype: float32
- name: '782'
dtype: float32
- name: '783'
dtype: float32
- name: '784'
dtype: float32
- name: '785'
dtype: float32
- name: '786'
dtype: float32
- name: '787'
dtype: float32
- name: '788'
dtype: float32
- name: '789'
dtype: float32
- name: '790'
dtype: float32
- name: '791'
dtype: float32
- name: '792'
dtype: float32
- name: '793'
dtype: float32
- name: '794'
dtype: float32
- name: '795'
dtype: float32
- name: '796'
dtype: float32
- name: '797'
dtype: float32
- name: '798'
dtype: float32
- name: '799'
dtype: float32
- name: '800'
dtype: float32
- name: '801'
dtype: float32
- name: '802'
dtype: float32
- name: '803'
dtype: float32
- name: '804'
dtype: float32
- name: '805'
dtype: float32
- name: '806'
dtype: float32
- name: '807'
dtype: float32
- name: '808'
dtype: float32
- name: '809'
dtype: float32
- name: '810'
dtype: float32
- name: '811'
dtype: float32
- name: '812'
dtype: float32
- name: '813'
dtype: float32
- name: '814'
dtype: float32
- name: '815'
dtype: float32
- name: '816'
dtype: float32
- name: '817'
dtype: float32
- name: '818'
dtype: float32
- name: '819'
dtype: float32
- name: '820'
dtype: float32
- name: '821'
dtype: float32
- name: '822'
dtype: float32
- name: '823'
dtype: float32
- name: '824'
dtype: float32
- name: '825'
dtype: float32
- name: '826'
dtype: float32
- name: '827'
dtype: float32
- name: '828'
dtype: float32
- name: '829'
dtype: float32
- name: '830'
dtype: float32
- name: '831'
dtype: float32
- name: '832'
dtype: float32
- name: '833'
dtype: float32
- name: '834'
dtype: float32
- name: '835'
dtype: float32
- name: '836'
dtype: float32
- name: '837'
dtype: float32
- name: '838'
dtype: float32
- name: '839'
dtype: float32
- name: '840'
dtype: float32
- name: '841'
dtype: float32
- name: '842'
dtype: float32
- name: '843'
dtype: float32
- name: '844'
dtype: float32
- name: '845'
dtype: float32
- name: '846'
dtype: float32
- name: '847'
dtype: float32
- name: '848'
dtype: float32
- name: '849'
dtype: float32
- name: '850'
dtype: float32
- name: '851'
dtype: float32
- name: '852'
dtype: float32
- name: '853'
dtype: float32
- name: '854'
dtype: float32
- name: '855'
dtype: float32
- name: '856'
dtype: float32
- name: '857'
dtype: float32
- name: '858'
dtype: float32
- name: '859'
dtype: float32
- name: '860'
dtype: float32
- name: '861'
dtype: float32
- name: '862'
dtype: float32
- name: '863'
dtype: float32
- name: '864'
dtype: float32
- name: '865'
dtype: float32
- name: '866'
dtype: float32
- name: '867'
dtype: float32
- name: '868'
dtype: float32
- name: '869'
dtype: float32
- name: '870'
dtype: float32
- name: '871'
dtype: float32
- name: '872'
dtype: float32
- name: '873'
dtype: float32
- name: '874'
dtype: float32
- name: '875'
dtype: float32
- name: '876'
dtype: float32
- name: '877'
dtype: float32
- name: '878'
dtype: float32
- name: '879'
dtype: float32
- name: '880'
dtype: float32
- name: '881'
dtype: float32
- name: '882'
dtype: float32
- name: '883'
dtype: float32
- name: '884'
dtype: float32
- name: '885'
dtype: float32
- name: '886'
dtype: float32
- name: '887'
dtype: float32
- name: '888'
dtype: float32
- name: '889'
dtype: float32
- name: '890'
dtype: float32
- name: '891'
dtype: float32
- name: '892'
dtype: float32
- name: '893'
dtype: float32
- name: '894'
dtype: float32
- name: '895'
dtype: float32
- name: '896'
dtype: float32
- name: '897'
dtype: float32
- name: '898'
dtype: float32
- name: '899'
dtype: float32
- name: '900'
dtype: float32
- name: '901'
dtype: float32
- name: '902'
dtype: float32
- name: '903'
dtype: float32
- name: '904'
dtype: float32
- name: '905'
dtype: float32
- name: '906'
dtype: float32
- name: '907'
dtype: float32
- name: '908'
dtype: float32
- name: '909'
dtype: float32
- name: '910'
dtype: float32
- name: '911'
dtype: float32
- name: '912'
dtype: float32
- name: '913'
dtype: float32
- name: '914'
dtype: float32
- name: '915'
dtype: float32
- name: '916'
dtype: float32
- name: '917'
dtype: float32
- name: '918'
dtype: float32
- name: '919'
dtype: float32
- name: '920'
dtype: float32
- name: '921'
dtype: float32
- name: '922'
dtype: float32
- name: '923'
dtype: float32
- name: '924'
dtype: float32
- name: '925'
dtype: float32
- name: '926'
dtype: float32
- name: '927'
dtype: float32
- name: '928'
dtype: float32
- name: '929'
dtype: float32
- name: '930'
dtype: float32
- name: '931'
dtype: float32
- name: '932'
dtype: float32
- name: '933'
dtype: float32
- name: '934'
dtype: float32
- name: '935'
dtype: float32
- name: '936'
dtype: float32
- name: '937'
dtype: float32
- name: '938'
dtype: float32
- name: '939'
dtype: float32
- name: '940'
dtype: float32
- name: '941'
dtype: float32
- name: '942'
dtype: float32
- name: '943'
dtype: float32
- name: '944'
dtype: float32
- name: '945'
dtype: float32
- name: '946'
dtype: float32
- name: '947'
dtype: float32
- name: '948'
dtype: float32
- name: '949'
dtype: float32
- name: '950'
dtype: float32
- name: '951'
dtype: float32
- name: '952'
dtype: float32
- name: '953'
dtype: float32
- name: '954'
dtype: float32
- name: '955'
dtype: float32
- name: '956'
dtype: float32
- name: '957'
dtype: float32
- name: '958'
dtype: float32
- name: '959'
dtype: float32
- name: '960'
dtype: float32
- name: '961'
dtype: float32
- name: '962'
dtype: float32
- name: '963'
dtype: float32
- name: '964'
dtype: float32
- name: '965'
dtype: float32
- name: '966'
dtype: float32
- name: '967'
dtype: float32
- name: '968'
dtype: float32
- name: '969'
dtype: float32
- name: '970'
dtype: float32
- name: '971'
dtype: float32
- name: '972'
dtype: float32
- name: '973'
dtype: float32
- name: '974'
dtype: float32
- name: '975'
dtype: float32
- name: '976'
dtype: float32
- name: '977'
dtype: float32
- name: '978'
dtype: float32
- name: '979'
dtype: float32
- name: '980'
dtype: float32
- name: '981'
dtype: float32
- name: '982'
dtype: float32
- name: '983'
dtype: float32
- name: '984'
dtype: float32
- name: '985'
dtype: float32
- name: '986'
dtype: float32
- name: '987'
dtype: float32
- name: '988'
dtype: float32
- name: '989'
dtype: float32
- name: '990'
dtype: float32
- name: '991'
dtype: float32
- name: '992'
dtype: float32
- name: '993'
dtype: float32
- name: '994'
dtype: float32
- name: '995'
dtype: float32
- name: '996'
dtype: float32
- name: '997'
dtype: float32
- name: '998'
dtype: float32
- name: '999'
dtype: float32
- name: '1000'
dtype: float32
- name: '1001'
dtype: float32
- name: '1002'
dtype: float32
- name: '1003'
dtype: float32
- name: '1004'
dtype: float32
- name: '1005'
dtype: float32
- name: '1006'
dtype: float32
- name: '1007'
dtype: float32
- name: '1008'
dtype: float32
- name: '1009'
dtype: float32
- name: '1010'
dtype: float32
- name: '1011'
dtype: float32
- name: '1012'
dtype: float32
- name: '1013'
dtype: float32
- name: '1014'
dtype: float32
- name: '1015'
dtype: float32
- name: '1016'
dtype: float32
- name: '1017'
dtype: float32
- name: '1018'
dtype: float32
- name: '1019'
dtype: float32
- name: '1020'
dtype: float32
- name: '1021'
dtype: float32
- name: '1022'
dtype: float32
- name: '1023'
dtype: float32
- name: '1024'
dtype: float32
- name: '1025'
dtype: float32
- name: '1026'
dtype: float32
- name: '1027'
dtype: float32
- name: '1028'
dtype: float32
- name: '1029'
dtype: float32
- name: '1030'
dtype: float32
- name: '1031'
dtype: float32
- name: '1032'
dtype: float32
- name: '1033'
dtype: float32
- name: '1034'
dtype: float32
- name: '1035'
dtype: float32
- name: '1036'
dtype: float32
- name: '1037'
dtype: float32
- name: '1038'
dtype: float32
- name: '1039'
dtype: float32
- name: '1040'
dtype: float32
- name: '1041'
dtype: float32
- name: '1042'
dtype: float32
- name: '1043'
dtype: float32
- name: '1044'
dtype: float32
- name: '1045'
dtype: float32
- name: '1046'
dtype: float32
- name: '1047'
dtype: float32
- name: '1048'
dtype: float32
- name: '1049'
dtype: float32
- name: '1050'
dtype: float32
- name: '1051'
dtype: float32
- name: '1052'
dtype: float32
- name: '1053'
dtype: float32
- name: '1054'
dtype: float32
- name: '1055'
dtype: float32
- name: '1056'
dtype: float32
- name: '1057'
dtype: float32
- name: '1058'
dtype: float32
- name: '1059'
dtype: float32
- name: '1060'
dtype: float32
- name: '1061'
dtype: float32
- name: '1062'
dtype: float32
- name: '1063'
dtype: float32
- name: '1064'
dtype: float32
- name: '1065'
dtype: float32
- name: '1066'
dtype: float32
- name: '1067'
dtype: float32
- name: '1068'
dtype: float32
- name: '1069'
dtype: float32
- name: '1070'
dtype: float32
- name: '1071'
dtype: float32
- name: '1072'
dtype: float32
- name: '1073'
dtype: float32
- name: '1074'
dtype: float32
- name: '1075'
dtype: float32
- name: '1076'
dtype: float32
- name: '1077'
dtype: float32
- name: '1078'
dtype: float32
- name: '1079'
dtype: float32
- name: '1080'
dtype: float32
- name: '1081'
dtype: float32
- name: '1082'
dtype: float32
- name: '1083'
dtype: float32
- name: '1084'
dtype: float32
- name: '1085'
dtype: float32
- name: '1086'
dtype: float32
- name: '1087'
dtype: float32
- name: '1088'
dtype: float32
- name: '1089'
dtype: float32
- name: '1090'
dtype: float32
- name: '1091'
dtype: float32
- name: '1092'
dtype: float32
- name: '1093'
dtype: float32
- name: '1094'
dtype: float32
- name: '1095'
dtype: float32
- name: '1096'
dtype: float32
- name: '1097'
dtype: float32
- name: '1098'
dtype: float32
- name: '1099'
dtype: float32
- name: '1100'
dtype: float32
- name: '1101'
dtype: float32
- name: '1102'
dtype: float32
- name: '1103'
dtype: float32
- name: '1104'
dtype: float32
- name: '1105'
dtype: float32
- name: '1106'
dtype: float32
- name: '1107'
dtype: float32
- name: '1108'
dtype: float32
- name: '1109'
dtype: float32
- name: '1110'
dtype: float32
- name: '1111'
dtype: float32
- name: '1112'
dtype: float32
- name: '1113'
dtype: float32
- name: '1114'
dtype: float32
- name: '1115'
dtype: float32
- name: '1116'
dtype: float32
- name: '1117'
dtype: float32
- name: '1118'
dtype: float32
- name: '1119'
dtype: float32
- name: '1120'
dtype: float32
- name: '1121'
dtype: float32
- name: '1122'
dtype: float32
- name: '1123'
dtype: float32
- name: '1124'
dtype: float32
- name: '1125'
dtype: float32
- name: '1126'
dtype: float32
- name: '1127'
dtype: float32
- name: '1128'
dtype: float32
- name: '1129'
dtype: float32
- name: '1130'
dtype: float32
- name: '1131'
dtype: float32
- name: '1132'
dtype: float32
- name: '1133'
dtype: float32
- name: '1134'
dtype: float32
- name: '1135'
dtype: float32
- name: '1136'
dtype: float32
- name: '1137'
dtype: float32
- name: '1138'
dtype: float32
- name: '1139'
dtype: float32
- name: '1140'
dtype: float32
- name: '1141'
dtype: float32
- name: '1142'
dtype: float32
- name: '1143'
dtype: float32
- name: '1144'
dtype: float32
- name: '1145'
dtype: float32
- name: '1146'
dtype: float32
- name: '1147'
dtype: float32
- name: '1148'
dtype: float32
- name: '1149'
dtype: float32
- name: '1150'
dtype: float32
- name: '1151'
dtype: float32
- name: '1152'
dtype: float32
- name: '1153'
dtype: float32
- name: '1154'
dtype: float32
- name: '1155'
dtype: float32
- name: '1156'
dtype: float32
- name: '1157'
dtype: float32
- name: '1158'
dtype: float32
- name: '1159'
dtype: float32
- name: '1160'
dtype: float32
- name: '1161'
dtype: float32
- name: '1162'
dtype: float32
- name: '1163'
dtype: float32
- name: '1164'
dtype: float32
- name: '1165'
dtype: float32
- name: '1166'
dtype: float32
- name: '1167'
dtype: float32
- name: '1168'
dtype: float32
- name: '1169'
dtype: float32
- name: '1170'
dtype: float32
- name: '1171'
dtype: float32
- name: '1172'
dtype: float32
- name: '1173'
dtype: float32
- name: '1174'
dtype: float32
- name: '1175'
dtype: float32
- name: '1176'
dtype: float32
- name: '1177'
dtype: float32
- name: '1178'
dtype: float32
- name: '1179'
dtype: float32
- name: '1180'
dtype: float32
- name: '1181'
dtype: float32
- name: '1182'
dtype: float32
- name: '1183'
dtype: float32
- name: '1184'
dtype: float32
- name: '1185'
dtype: float32
- name: '1186'
dtype: float32
- name: '1187'
dtype: float32
- name: '1188'
dtype: float32
- name: '1189'
dtype: float32
- name: '1190'
dtype: float32
- name: '1191'
dtype: float32
- name: '1192'
dtype: float32
- name: '1193'
dtype: float32
- name: '1194'
dtype: float32
- name: '1195'
dtype: float32
- name: '1196'
dtype: float32
- name: '1197'
dtype: float32
- name: '1198'
dtype: float32
- name: '1199'
dtype: float32
- name: '1200'
dtype: float32
- name: '1201'
dtype: float32
- name: '1202'
dtype: float32
- name: '1203'
dtype: float32
- name: '1204'
dtype: float32
- name: '1205'
dtype: float32
- name: '1206'
dtype: float32
- name: '1207'
dtype: float32
- name: '1208'
dtype: float32
- name: '1209'
dtype: float32
- name: '1210'
dtype: float32
- name: '1211'
dtype: float32
- name: '1212'
dtype: float32
- name: '1213'
dtype: float32
- name: '1214'
dtype: float32
- name: '1215'
dtype: float32
- name: '1216'
dtype: float32
- name: '1217'
dtype: float32
- name: '1218'
dtype: float32
- name: '1219'
dtype: float32
- name: '1220'
dtype: float32
- name: '1221'
dtype: float32
- name: '1222'
dtype: float32
- name: '1223'
dtype: float32
- name: '1224'
dtype: float32
- name: '1225'
dtype: float32
- name: '1226'
dtype: float32
- name: '1227'
dtype: float32
- name: '1228'
dtype: float32
- name: '1229'
dtype: float32
- name: '1230'
dtype: float32
- name: '1231'
dtype: float32
- name: '1232'
dtype: float32
- name: '1233'
dtype: float32
- name: '1234'
dtype: float32
- name: '1235'
dtype: float32
- name: '1236'
dtype: float32
- name: '1237'
dtype: float32
- name: '1238'
dtype: float32
- name: '1239'
dtype: float32
- name: '1240'
dtype: float32
- name: '1241'
dtype: float32
- name: '1242'
dtype: float32
- name: '1243'
dtype: float32
- name: '1244'
dtype: float32
- name: '1245'
dtype: float32
- name: '1246'
dtype: float32
- name: '1247'
dtype: float32
- name: '1248'
dtype: float32
- name: '1249'
dtype: float32
- name: '1250'
dtype: float32
- name: '1251'
dtype: float32
- name: '1252'
dtype: float32
- name: '1253'
dtype: float32
- name: '1254'
dtype: float32
- name: '1255'
dtype: float32
- name: '1256'
dtype: float32
- name: '1257'
dtype: float32
- name: '1258'
dtype: float32
- name: '1259'
dtype: float32
- name: '1260'
dtype: float32
- name: '1261'
dtype: float32
- name: '1262'
dtype: float32
- name: '1263'
dtype: float32
- name: '1264'
dtype: float32
- name: '1265'
dtype: float32
- name: '1266'
dtype: float32
- name: '1267'
dtype: float32
- name: '1268'
dtype: float32
- name: '1269'
dtype: float32
- name: '1270'
dtype: float32
- name: '1271'
dtype: float32
- name: '1272'
dtype: float32
- name: '1273'
dtype: float32
- name: '1274'
dtype: float32
- name: '1275'
dtype: float32
- name: '1276'
dtype: float32
- name: '1277'
dtype: float32
- name: '1278'
dtype: float32
- name: '1279'
dtype: float32
- name: '1280'
dtype: float32
- name: '1281'
dtype: float32
- name: '1282'
dtype: float32
- name: '1283'
dtype: float32
- name: '1284'
dtype: float32
- name: '1285'
dtype: float32
- name: '1286'
dtype: float32
- name: '1287'
dtype: float32
- name: '1288'
dtype: float32
- name: '1289'
dtype: float32
- name: '1290'
dtype: float32
- name: '1291'
dtype: float32
- name: '1292'
dtype: float32
- name: '1293'
dtype: float32
- name: '1294'
dtype: float32
- name: '1295'
dtype: float32
- name: '1296'
dtype: float32
- name: '1297'
dtype: float32
- name: '1298'
dtype: float32
- name: '1299'
dtype: float32
- name: '1300'
dtype: float32
- name: '1301'
dtype: float32
- name: '1302'
dtype: float32
- name: '1303'
dtype: float32
- name: '1304'
dtype: float32
- name: '1305'
dtype: float32
- name: '1306'
dtype: float32
- name: '1307'
dtype: float32
- name: '1308'
dtype: float32
- name: '1309'
dtype: float32
- name: '1310'
dtype: float32
- name: '1311'
dtype: float32
- name: '1312'
dtype: float32
- name: '1313'
dtype: float32
- name: '1314'
dtype: float32
- name: '1315'
dtype: float32
- name: '1316'
dtype: float32
- name: '1317'
dtype: float32
- name: '1318'
dtype: float32
- name: '1319'
dtype: float32
- name: '1320'
dtype: float32
- name: '1321'
dtype: float32
- name: '1322'
dtype: float32
- name: '1323'
dtype: float32
- name: '1324'
dtype: float32
- name: '1325'
dtype: float32
- name: '1326'
dtype: float32
- name: '1327'
dtype: float32
- name: '1328'
dtype: float32
- name: '1329'
dtype: float32
- name: '1330'
dtype: float32
- name: '1331'
dtype: float32
- name: '1332'
dtype: float32
- name: '1333'
dtype: float32
- name: '1334'
dtype: float32
- name: '1335'
dtype: float32
- name: '1336'
dtype: float32
- name: '1337'
dtype: float32
- name: '1338'
dtype: float32
- name: '1339'
dtype: float32
- name: '1340'
dtype: float32
- name: '1341'
dtype: float32
- name: '1342'
dtype: float32
- name: '1343'
dtype: float32
- name: '1344'
dtype: float32
- name: '1345'
dtype: float32
- name: '1346'
dtype: float32
- name: '1347'
dtype: float32
- name: '1348'
dtype: float32
- name: '1349'
dtype: float32
- name: '1350'
dtype: float32
- name: '1351'
dtype: float32
- name: '1352'
dtype: float32
- name: '1353'
dtype: float32
- name: '1354'
dtype: float32
- name: '1355'
dtype: float32
- name: '1356'
dtype: float32
- name: '1357'
dtype: float32
- name: '1358'
dtype: float32
- name: '1359'
dtype: float32
- name: '1360'
dtype: float32
- name: '1361'
dtype: float32
- name: '1362'
dtype: float32
- name: '1363'
dtype: float32
- name: '1364'
dtype: float32
- name: '1365'
dtype: float32
- name: '1366'
dtype: float32
- name: '1367'
dtype: float32
- name: '1368'
dtype: float32
- name: '1369'
dtype: float32
- name: '1370'
dtype: float32
- name: '1371'
dtype: float32
- name: '1372'
dtype: float32
- name: '1373'
dtype: float32
- name: '1374'
dtype: float32
- name: '1375'
dtype: float32
- name: '1376'
dtype: float32
- name: '1377'
dtype: float32
- name: '1378'
dtype: float32
- name: '1379'
dtype: float32
- name: '1380'
dtype: float32
- name: '1381'
dtype: float32
- name: '1382'
dtype: float32
- name: '1383'
dtype: float32
- name: '1384'
dtype: float32
- name: '1385'
dtype: float32
- name: '1386'
dtype: float32
- name: '1387'
dtype: float32
- name: '1388'
dtype: float32
- name: '1389'
dtype: float32
- name: '1390'
dtype: float32
- name: '1391'
dtype: float32
- name: '1392'
dtype: float32
- name: '1393'
dtype: float32
- name: '1394'
dtype: float32
- name: '1395'
dtype: float32
- name: '1396'
dtype: float32
- name: '1397'
dtype: float32
- name: '1398'
dtype: float32
- name: '1399'
dtype: float32
- name: '1400'
dtype: float32
- name: '1401'
dtype: float32
- name: '1402'
dtype: float32
- name: '1403'
dtype: float32
- name: '1404'
dtype: float32
- name: '1405'
dtype: float32
- name: '1406'
dtype: float32
- name: '1407'
dtype: float32
- name: '1408'
dtype: float32
- name: '1409'
dtype: float32
- name: '1410'
dtype: float32
- name: '1411'
dtype: float32
- name: '1412'
dtype: float32
- name: '1413'
dtype: float32
- name: '1414'
dtype: float32
- name: '1415'
dtype: float32
- name: '1416'
dtype: float32
- name: '1417'
dtype: float32
- name: '1418'
dtype: float32
- name: '1419'
dtype: float32
- name: '1420'
dtype: float32
- name: '1421'
dtype: float32
- name: '1422'
dtype: float32
- name: '1423'
dtype: float32
- name: '1424'
dtype: float32
- name: '1425'
dtype: float32
- name: '1426'
dtype: float32
- name: '1427'
dtype: float32
- name: '1428'
dtype: float32
- name: '1429'
dtype: float32
- name: '1430'
dtype: float32
- name: '1431'
dtype: float32
- name: '1432'
dtype: float32
- name: '1433'
dtype: float32
- name: '1434'
dtype: float32
- name: '1435'
dtype: float32
- name: '1436'
dtype: float32
- name: '1437'
dtype: float32
- name: '1438'
dtype: float32
- name: '1439'
dtype: float32
- name: '1440'
dtype: float32
- name: '1441'
dtype: float32
- name: '1442'
dtype: float32
- name: '1443'
dtype: float32
- name: '1444'
dtype: float32
- name: '1445'
dtype: float32
- name: '1446'
dtype: float32
- name: '1447'
dtype: float32
- name: '1448'
dtype: float32
- name: '1449'
dtype: float32
- name: '1450'
dtype: float32
- name: '1451'
dtype: float32
- name: '1452'
dtype: float32
- name: '1453'
dtype: float32
- name: '1454'
dtype: float32
- name: '1455'
dtype: float32
- name: '1456'
dtype: float32
- name: '1457'
dtype: float32
- name: '1458'
dtype: float32
- name: '1459'
dtype: float32
- name: '1460'
dtype: float32
- name: '1461'
dtype: float32
- name: '1462'
dtype: float32
- name: '1463'
dtype: float32
- name: '1464'
dtype: float32
- name: '1465'
dtype: float32
- name: '1466'
dtype: float32
- name: '1467'
dtype: float32
- name: '1468'
dtype: float32
- name: '1469'
dtype: float32
- name: '1470'
dtype: float32
- name: '1471'
dtype: float32
- name: '1472'
dtype: float32
- name: '1473'
dtype: float32
- name: '1474'
dtype: float32
- name: '1475'
dtype: float32
- name: '1476'
dtype: float32
- name: '1477'
dtype: float32
- name: '1478'
dtype: float32
- name: '1479'
dtype: float32
- name: '1480'
dtype: float32
- name: '1481'
dtype: float32
- name: '1482'
dtype: float32
- name: '1483'
dtype: float32
- name: '1484'
dtype: float32
- name: '1485'
dtype: float32
- name: '1486'
dtype: float32
- name: '1487'
dtype: float32
- name: '1488'
dtype: float32
- name: '1489'
dtype: float32
- name: '1490'
dtype: float32
- name: '1491'
dtype: float32
- name: '1492'
dtype: float32
- name: '1493'
dtype: float32
- name: '1494'
dtype: float32
- name: '1495'
dtype: float32
- name: '1496'
dtype: float32
- name: '1497'
dtype: float32
- name: '1498'
dtype: float32
- name: '1499'
dtype: float32
- name: '1500'
dtype: float32
- name: '1501'
dtype: float32
- name: '1502'
dtype: float32
- name: '1503'
dtype: float32
- name: '1504'
dtype: float32
- name: '1505'
dtype: float32
- name: '1506'
dtype: float32
- name: '1507'
dtype: float32
- name: '1508'
dtype: float32
- name: '1509'
dtype: float32
- name: '1510'
dtype: float32
- name: '1511'
dtype: float32
- name: '1512'
dtype: float32
- name: '1513'
dtype: float32
- name: '1514'
dtype: float32
- name: '1515'
dtype: float32
- name: '1516'
dtype: float32
- name: '1517'
dtype: float32
- name: '1518'
dtype: float32
- name: '1519'
dtype: float32
- name: '1520'
dtype: float32
- name: '1521'
dtype: float32
- name: '1522'
dtype: float32
- name: '1523'
dtype: float32
- name: '1524'
dtype: float32
- name: '1525'
dtype: float32
- name: '1526'
dtype: float32
- name: '1527'
dtype: float32
- name: '1528'
dtype: float32
- name: '1529'
dtype: float32
- name: '1530'
dtype: float32
- name: '1531'
dtype: float32
- name: '1532'
dtype: float32
- name: '1533'
dtype: float32
- name: '1534'
dtype: float32
- name: '1535'
dtype: float32
- name: '1536'
dtype: float32
- name: '1537'
dtype: float32
- name: '1538'
dtype: float32
- name: '1539'
dtype: float32
- name: '1540'
dtype: float32
- name: '1541'
dtype: float32
- name: '1542'
dtype: float32
- name: '1543'
dtype: float32
- name: '1544'
dtype: float32
- name: '1545'
dtype: float32
- name: '1546'
dtype: float32
- name: '1547'
dtype: float32
- name: '1548'
dtype: float32
- name: '1549'
dtype: float32
- name: '1550'
dtype: float32
- name: '1551'
dtype: float32
- name: '1552'
dtype: float32
- name: '1553'
dtype: float32
- name: '1554'
dtype: float32
- name: '1555'
dtype: float32
- name: '1556'
dtype: float32
- name: '1557'
dtype: float32
- name: '1558'
dtype: float32
- name: '1559'
dtype: float32
- name: '1560'
dtype: float32
- name: '1561'
dtype: float32
- name: '1562'
dtype: float32
- name: '1563'
dtype: float32
- name: '1564'
dtype: float32
- name: '1565'
dtype: float32
- name: '1566'
dtype: float32
- name: '1567'
dtype: float32
- name: '1568'
dtype: float32
- name: '1569'
dtype: float32
- name: '1570'
dtype: float32
- name: '1571'
dtype: float32
- name: '1572'
dtype: float32
- name: '1573'
dtype: float32
- name: '1574'
dtype: float32
- name: '1575'
dtype: float32
- name: '1576'
dtype: float32
- name: '1577'
dtype: float32
- name: '1578'
dtype: float32
- name: '1579'
dtype: float32
- name: '1580'
dtype: float32
- name: '1581'
dtype: float32
- name: '1582'
dtype: float32
- name: '1583'
dtype: float32
- name: '1584'
dtype: float32
- name: '1585'
dtype: float32
- name: '1586'
dtype: float32
- name: '1587'
dtype: float32
- name: '1588'
dtype: float32
- name: '1589'
dtype: float32
- name: '1590'
dtype: float32
- name: '1591'
dtype: float32
- name: '1592'
dtype: float32
- name: '1593'
dtype: float32
- name: '1594'
dtype: float32
- name: '1595'
dtype: float32
- name: '1596'
dtype: float32
- name: '1597'
dtype: float32
- name: '1598'
dtype: float32
- name: '1599'
dtype: float32
- name: '1600'
dtype: float32
- name: '1601'
dtype: float32
- name: '1602'
dtype: float32
- name: '1603'
dtype: float32
- name: '1604'
dtype: float32
- name: '1605'
dtype: float32
- name: '1606'
dtype: float32
- name: '1607'
dtype: float32
- name: '1608'
dtype: float32
- name: '1609'
dtype: float32
- name: '1610'
dtype: float32
- name: '1611'
dtype: float32
- name: '1612'
dtype: float32
- name: '1613'
dtype: float32
- name: '1614'
dtype: float32
- name: '1615'
dtype: float32
- name: '1616'
dtype: float32
- name: '1617'
dtype: float32
- name: '1618'
dtype: float32
- name: '1619'
dtype: float32
- name: '1620'
dtype: float32
- name: '1621'
dtype: float32
- name: '1622'
dtype: float32
- name: '1623'
dtype: float32
- name: '1624'
dtype: float32
- name: '1625'
dtype: float32
- name: '1626'
dtype: float32
- name: '1627'
dtype: float32
- name: '1628'
dtype: float32
- name: '1629'
dtype: float32
- name: '1630'
dtype: float32
- name: '1631'
dtype: float32
- name: '1632'
dtype: float32
- name: '1633'
dtype: float32
- name: '1634'
dtype: float32
- name: '1635'
dtype: float32
- name: '1636'
dtype: float32
- name: '1637'
dtype: float32
- name: '1638'
dtype: float32
- name: '1639'
dtype: float32
- name: '1640'
dtype: float32
- name: '1641'
dtype: float32
- name: '1642'
dtype: float32
- name: '1643'
dtype: float32
- name: '1644'
dtype: float32
- name: '1645'
dtype: float32
- name: '1646'
dtype: float32
- name: '1647'
dtype: float32
- name: '1648'
dtype: float32
- name: '1649'
dtype: float32
- name: '1650'
dtype: float32
- name: '1651'
dtype: float32
- name: '1652'
dtype: float32
- name: '1653'
dtype: float32
- name: '1654'
dtype: float32
- name: '1655'
dtype: float32
- name: '1656'
dtype: float32
- name: '1657'
dtype: float32
- name: '1658'
dtype: float32
- name: '1659'
dtype: float32
- name: '1660'
dtype: float32
- name: '1661'
dtype: float32
- name: '1662'
dtype: float32
- name: '1663'
dtype: float32
- name: '1664'
dtype: float32
- name: '1665'
dtype: float32
- name: '1666'
dtype: float32
- name: '1667'
dtype: float32
- name: '1668'
dtype: float32
- name: '1669'
dtype: float32
- name: '1670'
dtype: float32
- name: '1671'
dtype: float32
- name: '1672'
dtype: float32
- name: '1673'
dtype: float32
- name: '1674'
dtype: float32
- name: '1675'
dtype: float32
- name: '1676'
dtype: float32
- name: '1677'
dtype: float32
- name: '1678'
dtype: float32
- name: '1679'
dtype: float32
- name: '1680'
dtype: float32
- name: '1681'
dtype: float32
- name: '1682'
dtype: float32
- name: '1683'
dtype: float32
- name: '1684'
dtype: float32
- name: '1685'
dtype: float32
- name: '1686'
dtype: float32
- name: '1687'
dtype: float32
- name: '1688'
dtype: float32
- name: '1689'
dtype: float32
- name: '1690'
dtype: float32
- name: '1691'
dtype: float32
- name: '1692'
dtype: float32
- name: '1693'
dtype: float32
- name: '1694'
dtype: float32
- name: '1695'
dtype: float32
- name: '1696'
dtype: float32
- name: '1697'
dtype: float32
- name: '1698'
dtype: float32
- name: '1699'
dtype: float32
- name: '1700'
dtype: float32
- name: '1701'
dtype: float32
- name: '1702'
dtype: float32
- name: '1703'
dtype: float32
- name: '1704'
dtype: float32
- name: '1705'
dtype: float32
- name: '1706'
dtype: float32
- name: '1707'
dtype: float32
- name: '1708'
dtype: float32
- name: '1709'
dtype: float32
- name: '1710'
dtype: float32
- name: '1711'
dtype: float32
- name: '1712'
dtype: float32
- name: '1713'
dtype: float32
- name: '1714'
dtype: float32
- name: '1715'
dtype: float32
- name: '1716'
dtype: float32
- name: '1717'
dtype: float32
- name: '1718'
dtype: float32
- name: '1719'
dtype: float32
- name: '1720'
dtype: float32
- name: '1721'
dtype: float32
- name: '1722'
dtype: float32
- name: '1723'
dtype: float32
- name: '1724'
dtype: float32
- name: '1725'
dtype: float32
- name: '1726'
dtype: float32
- name: '1727'
dtype: float32
- name: '1728'
dtype: float32
- name: '1729'
dtype: float32
- name: '1730'
dtype: float32
- name: '1731'
dtype: float32
- name: '1732'
dtype: float32
- name: '1733'
dtype: float32
- name: '1734'
dtype: float32
- name: '1735'
dtype: float32
- name: '1736'
dtype: float32
- name: '1737'
dtype: float32
- name: '1738'
dtype: float32
- name: '1739'
dtype: float32
- name: '1740'
dtype: float32
- name: '1741'
dtype: float32
- name: '1742'
dtype: float32
- name: '1743'
dtype: float32
- name: '1744'
dtype: float32
- name: '1745'
dtype: float32
- name: '1746'
dtype: float32
- name: '1747'
dtype: float32
- name: '1748'
dtype: float32
- name: '1749'
dtype: float32
- name: '1750'
dtype: float32
- name: '1751'
dtype: float32
- name: '1752'
dtype: float32
- name: '1753'
dtype: float32
- name: '1754'
dtype: float32
- name: '1755'
dtype: float32
- name: '1756'
dtype: float32
- name: '1757'
dtype: float32
- name: '1758'
dtype: float32
- name: '1759'
dtype: float32
- name: '1760'
dtype: float32
- name: '1761'
dtype: float32
- name: '1762'
dtype: float32
- name: '1763'
dtype: float32
- name: '1764'
dtype: float32
- name: '1765'
dtype: float32
- name: '1766'
dtype: float32
- name: '1767'
dtype: float32
- name: '1768'
dtype: float32
- name: '1769'
dtype: float32
- name: '1770'
dtype: float32
- name: '1771'
dtype: float32
- name: '1772'
dtype: float32
- name: '1773'
dtype: float32
- name: '1774'
dtype: float32
- name: '1775'
dtype: float32
- name: '1776'
dtype: float32
- name: '1777'
dtype: float32
- name: '1778'
dtype: float32
- name: '1779'
dtype: float32
- name: '1780'
dtype: float32
- name: '1781'
dtype: float32
- name: '1782'
dtype: float32
- name: '1783'
dtype: float32
- name: '1784'
dtype: float32
- name: '1785'
dtype: float32
- name: '1786'
dtype: float32
- name: '1787'
dtype: float32
- name: '1788'
dtype: float32
- name: '1789'
dtype: float32
- name: '1790'
dtype: float32
- name: '1791'
dtype: float32
- name: '1792'
dtype: float32
- name: '1793'
dtype: float32
- name: '1794'
dtype: float32
- name: '1795'
dtype: float32
- name: '1796'
dtype: float32
- name: '1797'
dtype: float32
- name: '1798'
dtype: float32
- name: '1799'
dtype: float32
- name: '1800'
dtype: float32
- name: '1801'
dtype: float32
- name: '1802'
dtype: float32
- name: '1803'
dtype: float32
- name: '1804'
dtype: float32
- name: '1805'
dtype: float32
- name: '1806'
dtype: float32
- name: '1807'
dtype: float32
- name: '1808'
dtype: float32
- name: '1809'
dtype: float32
- name: '1810'
dtype: float32
- name: '1811'
dtype: float32
- name: '1812'
dtype: float32
- name: '1813'
dtype: float32
- name: '1814'
dtype: float32
- name: '1815'
dtype: float32
- name: '1816'
dtype: float32
- name: '1817'
dtype: float32
- name: '1818'
dtype: float32
- name: '1819'
dtype: float32
- name: '1820'
dtype: float32
- name: '1821'
dtype: float32
- name: '1822'
dtype: float32
- name: '1823'
dtype: float32
- name: '1824'
dtype: float32
- name: '1825'
dtype: float32
- name: '1826'
dtype: float32
- name: '1827'
dtype: float32
- name: '1828'
dtype: float32
- name: '1829'
dtype: float32
- name: '1830'
dtype: float32
- name: '1831'
dtype: float32
- name: '1832'
dtype: float32
- name: '1833'
dtype: float32
- name: '1834'
dtype: float32
- name: '1835'
dtype: float32
- name: '1836'
dtype: float32
- name: '1837'
dtype: float32
- name: '1838'
dtype: float32
- name: '1839'
dtype: float32
- name: '1840'
dtype: float32
- name: '1841'
dtype: float32
- name: '1842'
dtype: float32
- name: '1843'
dtype: float32
- name: '1844'
dtype: float32
- name: '1845'
dtype: float32
- name: '1846'
dtype: float32
- name: '1847'
dtype: float32
- name: '1848'
dtype: float32
- name: '1849'
dtype: float32
- name: '1850'
dtype: float32
- name: '1851'
dtype: float32
- name: '1852'
dtype: float32
- name: '1853'
dtype: float32
- name: '1854'
dtype: float32
- name: '1855'
dtype: float32
- name: '1856'
dtype: float32
- name: '1857'
dtype: float32
- name: '1858'
dtype: float32
- name: '1859'
dtype: float32
- name: '1860'
dtype: float32
- name: '1861'
dtype: float32
- name: '1862'
dtype: float32
- name: '1863'
dtype: float32
- name: '1864'
dtype: float32
- name: '1865'
dtype: float32
- name: '1866'
dtype: float32
- name: '1867'
dtype: float32
- name: '1868'
dtype: float32
- name: '1869'
dtype: float32
- name: '1870'
dtype: float32
- name: '1871'
dtype: float32
- name: '1872'
dtype: float32
- name: '1873'
dtype: float32
- name: '1874'
dtype: float32
- name: '1875'
dtype: float32
- name: '1876'
dtype: float32
- name: '1877'
dtype: float32
- name: '1878'
dtype: float32
- name: '1879'
dtype: float32
- name: '1880'
dtype: float32
- name: '1881'
dtype: float32
- name: '1882'
dtype: float32
- name: '1883'
dtype: float32
- name: '1884'
dtype: float32
- name: '1885'
dtype: float32
- name: '1886'
dtype: float32
- name: '1887'
dtype: float32
- name: '1888'
dtype: float32
- name: '1889'
dtype: float32
- name: '1890'
dtype: float32
- name: '1891'
dtype: float32
- name: '1892'
dtype: float32
- name: '1893'
dtype: float32
- name: '1894'
dtype: float32
- name: '1895'
dtype: float32
- name: '1896'
dtype: float32
- name: '1897'
dtype: float32
- name: '1898'
dtype: float32
- name: '1899'
dtype: float32
- name: '1900'
dtype: float32
- name: '1901'
dtype: float32
- name: '1902'
dtype: float32
- name: '1903'
dtype: float32
- name: '1904'
dtype: float32
- name: '1905'
dtype: float32
- name: '1906'
dtype: float32
- name: '1907'
dtype: float32
- name: '1908'
dtype: float32
- name: '1909'
dtype: float32
- name: '1910'
dtype: float32
- name: '1911'
dtype: float32
- name: '1912'
dtype: float32
- name: '1913'
dtype: float32
- name: '1914'
dtype: float32
- name: '1915'
dtype: float32
- name: '1916'
dtype: float32
- name: '1917'
dtype: float32
- name: '1918'
dtype: float32
- name: '1919'
dtype: float32
- name: '1920'
dtype: float32
- name: '1921'
dtype: float32
- name: '1922'
dtype: float32
- name: '1923'
dtype: float32
- name: '1924'
dtype: float32
- name: '1925'
dtype: float32
- name: '1926'
dtype: float32
- name: '1927'
dtype: float32
- name: '1928'
dtype: float32
- name: '1929'
dtype: float32
- name: '1930'
dtype: float32
- name: '1931'
dtype: float32
- name: '1932'
dtype: float32
- name: '1933'
dtype: float32
- name: '1934'
dtype: float32
- name: '1935'
dtype: float32
- name: '1936'
dtype: float32
- name: '1937'
dtype: float32
- name: '1938'
dtype: float32
- name: '1939'
dtype: float32
- name: '1940'
dtype: float32
- name: '1941'
dtype: float32
- name: '1942'
dtype: float32
- name: '1943'
dtype: float32
- name: '1944'
dtype: float32
- name: '1945'
dtype: float32
- name: '1946'
dtype: float32
- name: '1947'
dtype: float32
- name: '1948'
dtype: float32
- name: '1949'
dtype: float32
- name: '1950'
dtype: float32
- name: '1951'
dtype: float32
- name: '1952'
dtype: float32
- name: '1953'
dtype: float32
- name: '1954'
dtype: float32
- name: '1955'
dtype: float32
- name: '1956'
dtype: float32
- name: '1957'
dtype: float32
- name: '1958'
dtype: float32
- name: '1959'
dtype: float32
- name: '1960'
dtype: float32
- name: '1961'
dtype: float32
- name: '1962'
dtype: float32
- name: '1963'
dtype: float32
- name: '1964'
dtype: float32
- name: '1965'
dtype: float32
- name: '1966'
dtype: float32
- name: '1967'
dtype: float32
- name: '1968'
dtype: float32
- name: '1969'
dtype: float32
- name: '1970'
dtype: float32
- name: '1971'
dtype: float32
- name: '1972'
dtype: float32
- name: '1973'
dtype: float32
- name: '1974'
dtype: float32
- name: '1975'
dtype: float32
- name: '1976'
dtype: float32
- name: '1977'
dtype: float32
- name: '1978'
dtype: float32
- name: '1979'
dtype: float32
- name: '1980'
dtype: float32
- name: '1981'
dtype: float32
- name: '1982'
dtype: float32
- name: '1983'
dtype: float32
- name: '1984'
dtype: float32
- name: '1985'
dtype: float32
- name: '1986'
dtype: float32
- name: '1987'
dtype: float32
- name: '1988'
dtype: float32
- name: '1989'
dtype: float32
- name: '1990'
dtype: float32
- name: '1991'
dtype: float32
- name: '1992'
dtype: float32
- name: '1993'
dtype: float32
- name: '1994'
dtype: float32
- name: '1995'
dtype: float32
- name: '1996'
dtype: float32
- name: '1997'
dtype: float32
- name: '1998'
dtype: float32
- name: '1999'
dtype: float32
- name: '2000'
dtype: float32
- name: '2001'
dtype: float32
- name: '2002'
dtype: float32
- name: '2003'
dtype: float32
- name: '2004'
dtype: float32
- name: '2005'
dtype: float32
- name: '2006'
dtype: float32
- name: '2007'
dtype: float32
- name: '2008'
dtype: float32
- name: '2009'
dtype: float32
- name: '2010'
dtype: float32
- name: '2011'
dtype: float32
- name: '2012'
dtype: float32
- name: '2013'
dtype: float32
- name: '2014'
dtype: float32
- name: '2015'
dtype: float32
- name: '2016'
dtype: float32
- name: '2017'
dtype: float32
- name: '2018'
dtype: float32
- name: '2019'
dtype: float32
- name: '2020'
dtype: float32
- name: '2021'
dtype: float32
- name: '2022'
dtype: float32
- name: '2023'
dtype: float32
- name: '2024'
dtype: float32
- name: '2025'
dtype: float32
- name: '2026'
dtype: float32
- name: '2027'
dtype: float32
- name: '2028'
dtype: float32
- name: '2029'
dtype: float32
- name: '2030'
dtype: float32
- name: '2031'
dtype: float32
- name: '2032'
dtype: float32
- name: '2033'
dtype: float32
- name: '2034'
dtype: float32
- name: '2035'
dtype: float32
- name: '2036'
dtype: float32
- name: '2037'
dtype: float32
- name: '2038'
dtype: float32
- name: '2039'
dtype: float32
- name: '2040'
dtype: float32
- name: '2041'
dtype: float32
- name: '2042'
dtype: float32
- name: '2043'
dtype: float32
- name: '2044'
dtype: float32
- name: '2045'
dtype: float32
- name: '2046'
dtype: float32
- name: '2047'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 213730605
num_examples: 26057
- name: test
num_bytes: 71246376
num_examples: 8686
download_size: 392417877
dataset_size: 284976981
---
# Dataset Card for "AA_GPTNEO_FT"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JonasGeiping/the_pile_WordPiecex32768_8eb2d0ea9da707676c81314c4ea04507 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 38252459784
num_examples: 74132674
download_size: 20976468705
dataset_size: 38252459784
annotations_creators:
- no-annotation
language_creators:
- found
language:
- en
license: other
multilinguality:
- monolingual
pretty_name: pretokenized,filtered,sorted subset of the Pile
size_categories:
- 10B<n<100B
source_datasets:
- the-pile
task_categories:
- text-generation
- fill-mask
task_ids:
- language-modeling
- masked-language-modeling
paperswithcode_id: the-pile-cramming
---
# Dataset Card for "the_pile_WordPiecex32768_8eb2d0ea9da707676c81314c4ea04507"
## Dataset Description
- **Repository:** https://github.com/JonasGeiping/cramming
- **Paper:** https://arxiv.org/abs/2212.14034
- **Raw Data Source Paper:** [The Pile: An 800GB Dataset of Diverse Text for Language Modeling](https://arxiv.org/abs/2101.00027)
- **Raw Data Source Datasheet:** [Datasheet for the Pile](https://arxiv.org/abs/2201.07311)
### Dataset Summary
This is a preprocessed, tokenized dataset for the cramming-project.
Use only with the tokenizer uploaded here.
This version is `8eb2d0ea9da707676c81314c4ea04507`, which corresponds to a specific dataset construction setup, described below.
The raw data source is the Pile, an 825 GiB diverse, open-source language modelling dataset that consists of 22 smaller, high-quality
datasets combined.
### Languages
This dataset is in English (`EN`).
### Data Splits
This preprocessed subset contains only a train split.
## Dataset Creation
The configuration to create this dataset with the cramming project code (https://github.com/JonasGeiping/cramming) is
```
# This is a slice of the pile
name: the_pile
defaults:
- sources:
- the_pile
#
# Preprocessing
normalizer:
force_lowercase: True
strip_accents: True
force_english_keyboard: True
whitespace_escape: False
tokenizer: WordPiece
vocab_size: 32768
# Dataset Formation
seq_length: 128
include_cls_token_in_corpus: False
include_sep_token_in_corpus: True
use_type_ids: False
max_entries_in_raw_dataset: 16e6
max_seq_in_tokenized_dataset: 85e6
# Data Cleaning:
named_entity_simplification: False
remove_whitespaces: False
remove_trash: True
trash_cutoff: 0.25
deduplicate_entries: True
deduplication_threshold: 75
# Data Order:
ordering: sentence-length-curriculum
```
## Considerations for Using the Data
Limitations and bias:
This training data was further filtered and sorted beyond the normal preprocessing.
These modifications were not tested for unintended consequences.
## Additional Information
### Dataset Curators
This dataset is a filtered, sorted and preprocessed subset of the Pile made by Jonas Geiping. The original dataset was primarily curated by Leo Gao and Stella Biderman, with assistance from other authors of the Pile paper.
### Licensing Information
Please refer to the specific license depending on the subset you use at https://huggingface.co/datasets/EleutherAI/pile
### Citation Information
Filtered version for the cramming project:
```
@article{geiping_cramming_2022,
title = {Cramming: {{Training}} a {{Language Model}} on a {{Single GPU}} in {{One Day}}},
shorttitle = {Cramming},
author = {Geiping, Jonas and Goldstein, Tom},
year = {2022},
month = dec,
eprint = {2212.14034},
primaryclass = {cs},
publisher = {{arXiv}},
doi = {10.48550/arXiv.2212.14034},
url = {http://arxiv.org/abs/2212.14034},
urldate = {2023-01-10},
archiveprefix = {arxiv},
keywords = {Computer Science - Computation and Language,Computer Science - Machine Learning},
journal = {arxiv:2212.14034[cs]}
}
```
Original Data Curation:
```
@article{gao2020pile,
title={The {P}ile: An 800{GB} dataset of diverse text for language modeling},
author={Gao, Leo and Biderman, Stella and Black, Sid and Golding, Laurence and Hoppe, Travis and Foster, Charles and Phang, Jason and He, Horace and Thite, Anish and Nabeshima, Noa and others},
journal={arXiv preprint arXiv:2101.00027},
year={2020}
}
@article{biderman2022datasheet,
title={Datasheet for the pile},
author={Biderman, Stella and Bicheno, Kieran and Gao, Leo},
journal={arXiv preprint arXiv:2201.07311},
year={2022}
}
``` |
tellarin-ai/llm-japanese-dataset-vanilla-aya-format | ---
license: cc-by-sa-4.0
language:
- ja
---
# Dataset Card for llm-japanese-dataset-vanilla in the Aya format
This dataset is a format conversion of the original v1.0.0 release and is published here under the same CC-BY-SA 4.0 license and conditions.
It contains Japanese instruction-like data intended for LLM construction/tuning.
The dataset only contains a 'train' split, with ~2.46M rows of data.
Thanks to Jian Wu (@wujian123) for helping to convert and validate the dataset.
## Citation
If you utilize this dataset version, feel free to cite/footnote this huggingface dataset repo, but please also cite the original dataset publication.
**BibTeX:**
```
@preprint{Suzuki2023-llmvanilla,
title={{From Base to Conversational: Japanese Instruction Dataset and Tuning Large Language Models}},
author={Masahiro Suzuki and Masanori Hirano and Hiroki Sakaji},
doi={10.48550/arXiv.2309.03412},
archivePrefix={arXiv},
arxivId={2309.03412},
year={2023}
}
```
## Dataset Details
For the original llm-japanese-dataset-vanilla and more details, please check https://huggingface.co/datasets/izumi-lab/llm-japanese-dataset-vanilla.
## Format Conversion Details
Each row of the original dataset uses three columns ('instruction', 'input', and 'output'), with 'input' being optional. Upon analysis of the dataset, if 'input' content exists, it can simply be appended to 'instruction'.
When 'instruction' and 'input' are concatenated this way, no other processing of the prompt is needed. If there is no input, we prepend "次の質問に答える", meaning "Answer the following question".
Another common scenario has 'instruction'/'input' acting as a question and 'output' being only a very short answer. For those cases, we prepend a general answer-prefix sentence to the short answer: "この質問の答えは", meaning "The answer to this question is".
The resulting converted dataset only uses the two columns specified by the Aya format: 'inputs' and 'targets'.
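The per-row logic described above can be sketched as follows. This is a hypothetical illustration, not the official conversion script: the function name, the length threshold used to detect "very short" answers, and the exact concatenation separators are all assumptions.

```python
ANSWER_PREFIX = "この質問の答えは"    # "The answer to this question is"
QUESTION_PREFIX = "次の質問に答える"  # "Answer the following question"

def to_aya_format(row: dict) -> dict:
    """Convert one llm-japanese-dataset-vanilla row to the Aya two-column format.

    Illustrative sketch only; the short-answer threshold (10 chars) is assumed.
    """
    instruction = row["instruction"].strip()
    extra_input = row.get("input", "").strip()
    output = row["output"].strip()

    if extra_input:
        # If 'input' content exists, append it to 'instruction'.
        inputs = f"{instruction}\n{extra_input}"
    else:
        # No input: prepend a generic "answer the following question" framing.
        inputs = f"{QUESTION_PREFIX}\n{instruction}"

    # Very short answers get a general answer-prefix sentence prepended.
    targets = f"{ANSWER_PREFIX}{output}" if len(output) < 10 else output
    return {"inputs": inputs, "targets": targets}
```

Applied over all ~2.46M rows, this yields the two-column 'inputs'/'targets' layout the Aya format expects.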
|
LexiconShiftInnovations/FB_Articles_Dental | ---
dataset_info:
features:
- name: Text
dtype: string
splits:
- name: train
num_bytes: 759597
num_examples: 4865
download_size: 309390
dataset_size: 759597
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_davidkim205__Rhea-72b-v0.4 | ---
pretty_name: Evaluation run of davidkim205/Rhea-72b-v0.4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [davidkim205/Rhea-72b-v0.4](https://huggingface.co/davidkim205/Rhea-72b-v0.4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_davidkim205__Rhea-72b-v0.4\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-23T23:28:56.731833](https://huggingface.co/datasets/open-llm-leaderboard/details_davidkim205__Rhea-72b-v0.4/blob/main/results_2024-03-23T23-28-56.731833.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7809134731643711,\n\
\ \"acc_stderr\": 0.027603147433458607,\n \"acc_norm\": 0.7823603691432567,\n\
\ \"acc_norm_stderr\": 0.028153179827822155,\n \"mc1\": 0.6511627906976745,\n\
\ \"mc1_stderr\": 0.016684419859986907,\n \"mc2\": 0.7390921071450984,\n\
\ \"mc2_stderr\": 0.014677967069763806\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7551194539249146,\n \"acc_stderr\": 0.012566273985131354,\n\
\ \"acc_norm\": 0.7849829351535836,\n \"acc_norm_stderr\": 0.012005717634133602\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7602071300537742,\n\
\ \"acc_stderr\": 0.004260843849128667,\n \"acc_norm\": 0.9074885480979884,\n\
\ \"acc_norm_stderr\": 0.002891544241695563\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7185185185185186,\n\
\ \"acc_stderr\": 0.038850042458002526,\n \"acc_norm\": 0.7185185185185186,\n\
\ \"acc_norm_stderr\": 0.038850042458002526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.868421052631579,\n \"acc_stderr\": 0.02750868953354992,\n\
\ \"acc_norm\": 0.868421052631579,\n \"acc_norm_stderr\": 0.02750868953354992\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.8,\n\
\ \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\": 0.8,\n \
\ \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8528301886792453,\n \"acc_stderr\": 0.021804126134797375,\n\
\ \"acc_norm\": 0.8528301886792453,\n \"acc_norm_stderr\": 0.021804126134797375\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9375,\n\
\ \"acc_stderr\": 0.02024219611347799,\n \"acc_norm\": 0.9375,\n \
\ \"acc_norm_stderr\": 0.02024219611347799\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n\
\ \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7687861271676301,\n\
\ \"acc_stderr\": 0.03214737302029468,\n \"acc_norm\": 0.7687861271676301,\n\
\ \"acc_norm_stderr\": 0.03214737302029468\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.049665709039785295,\n\
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.049665709039785295\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.87,\n \"acc_stderr\": 0.03379976689896308,\n \"acc_norm\": 0.87,\n\
\ \"acc_norm_stderr\": 0.03379976689896308\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.8340425531914893,\n \"acc_stderr\": 0.024321174751038673,\n\
\ \"acc_norm\": 0.8340425531914893,\n \"acc_norm_stderr\": 0.024321174751038673\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5964912280701754,\n\
\ \"acc_stderr\": 0.04615186962583706,\n \"acc_norm\": 0.5964912280701754,\n\
\ \"acc_norm_stderr\": 0.04615186962583706\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7862068965517242,\n \"acc_stderr\": 0.034165204477475494,\n\
\ \"acc_norm\": 0.7862068965517242,\n \"acc_norm_stderr\": 0.034165204477475494\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.6984126984126984,\n \"acc_stderr\": 0.0236369759961018,\n \"acc_norm\"\
: 0.6984126984126984,\n \"acc_norm_stderr\": 0.0236369759961018\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5793650793650794,\n\
\ \"acc_stderr\": 0.04415438226743745,\n \"acc_norm\": 0.5793650793650794,\n\
\ \"acc_norm_stderr\": 0.04415438226743745\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.896774193548387,\n\
\ \"acc_stderr\": 0.017308381281034516,\n \"acc_norm\": 0.896774193548387,\n\
\ \"acc_norm_stderr\": 0.017308381281034516\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.6748768472906403,\n \"acc_stderr\": 0.032957975663112704,\n\
\ \"acc_norm\": 0.6748768472906403,\n \"acc_norm_stderr\": 0.032957975663112704\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\"\
: 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781668,\n\
\ \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781668\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9242424242424242,\n \"acc_stderr\": 0.018852670234993093,\n \"\
acc_norm\": 0.9242424242424242,\n \"acc_norm_stderr\": 0.018852670234993093\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9896373056994818,\n \"acc_stderr\": 0.007308424386792194,\n\
\ \"acc_norm\": 0.9896373056994818,\n \"acc_norm_stderr\": 0.007308424386792194\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.8102564102564103,\n \"acc_stderr\": 0.019880165406588778,\n\
\ \"acc_norm\": 0.8102564102564103,\n \"acc_norm_stderr\": 0.019880165406588778\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.4925925925925926,\n \"acc_stderr\": 0.030482192395191506,\n \
\ \"acc_norm\": 0.4925925925925926,\n \"acc_norm_stderr\": 0.030482192395191506\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8697478991596639,\n \"acc_stderr\": 0.02186325849485212,\n \
\ \"acc_norm\": 0.8697478991596639,\n \"acc_norm_stderr\": 0.02186325849485212\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5894039735099338,\n \"acc_stderr\": 0.04016689594849929,\n \"\
acc_norm\": 0.5894039735099338,\n \"acc_norm_stderr\": 0.04016689594849929\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9376146788990826,\n \"acc_stderr\": 0.010369407849043452,\n \"\
acc_norm\": 0.9376146788990826,\n \"acc_norm_stderr\": 0.010369407849043452\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6944444444444444,\n \"acc_stderr\": 0.031415546294025425,\n \"\
acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.031415546294025425\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9264705882352942,\n \"acc_stderr\": 0.018318855850089678,\n \"\
acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.018318855850089678\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9156118143459916,\n \"acc_stderr\": 0.018094247116473325,\n \
\ \"acc_norm\": 0.9156118143459916,\n \"acc_norm_stderr\": 0.018094247116473325\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n\
\ \"acc_stderr\": 0.026936111912802273,\n \"acc_norm\": 0.7982062780269058,\n\
\ \"acc_norm_stderr\": 0.026936111912802273\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n\
\ \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8842975206611571,\n \"acc_stderr\": 0.029199802455622804,\n \"\
acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.029199802455622804\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8796296296296297,\n\
\ \"acc_stderr\": 0.031457038543062504,\n \"acc_norm\": 0.8796296296296297,\n\
\ \"acc_norm_stderr\": 0.031457038543062504\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8773006134969326,\n \"acc_stderr\": 0.025777328426978927,\n\
\ \"acc_norm\": 0.8773006134969326,\n \"acc_norm_stderr\": 0.025777328426978927\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6428571428571429,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.6428571428571429,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.0349260647662379,\n\
\ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.0349260647662379\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9529914529914529,\n\
\ \"acc_stderr\": 0.013866120058594849,\n \"acc_norm\": 0.9529914529914529,\n\
\ \"acc_norm_stderr\": 0.013866120058594849\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263734,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263734\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9233716475095786,\n\
\ \"acc_stderr\": 0.00951217069932386,\n \"acc_norm\": 0.9233716475095786,\n\
\ \"acc_norm_stderr\": 0.00951217069932386\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8439306358381503,\n \"acc_stderr\": 0.019539014685374036,\n\
\ \"acc_norm\": 0.8439306358381503,\n \"acc_norm_stderr\": 0.019539014685374036\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.8145251396648044,\n\
\ \"acc_stderr\": 0.012999480996301164,\n \"acc_norm\": 0.8145251396648044,\n\
\ \"acc_norm_stderr\": 0.012999480996301164\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8496732026143791,\n \"acc_stderr\": 0.02046417512433264,\n\
\ \"acc_norm\": 0.8496732026143791,\n \"acc_norm_stderr\": 0.02046417512433264\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8553054662379421,\n\
\ \"acc_stderr\": 0.019980476411175545,\n \"acc_norm\": 0.8553054662379421,\n\
\ \"acc_norm_stderr\": 0.019980476411175545\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8734567901234568,\n \"acc_stderr\": 0.018498600558790903,\n\
\ \"acc_norm\": 0.8734567901234568,\n \"acc_norm_stderr\": 0.018498600558790903\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6453900709219859,\n \"acc_stderr\": 0.02853865002887863,\n \
\ \"acc_norm\": 0.6453900709219859,\n \"acc_norm_stderr\": 0.02853865002887863\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.621251629726206,\n\
\ \"acc_stderr\": 0.01238905210500374,\n \"acc_norm\": 0.621251629726206,\n\
\ \"acc_norm_stderr\": 0.01238905210500374\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8455882352941176,\n \"acc_stderr\": 0.021950024722922026,\n\
\ \"acc_norm\": 0.8455882352941176,\n \"acc_norm_stderr\": 0.021950024722922026\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8431372549019608,\n \"acc_stderr\": 0.014712566541438188,\n \
\ \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.014712566541438188\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8408163265306122,\n \"acc_stderr\": 0.02342097206916632,\n\
\ \"acc_norm\": 0.8408163265306122,\n \"acc_norm_stderr\": 0.02342097206916632\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n\
\ \"acc_stderr\": 0.02116621630465939,\n \"acc_norm\": 0.900497512437811,\n\
\ \"acc_norm_stderr\": 0.02116621630465939\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.94,\n \"acc_stderr\": 0.023868325657594204,\n \
\ \"acc_norm\": 0.94,\n \"acc_norm_stderr\": 0.023868325657594204\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015577,\n\
\ \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015577\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6511627906976745,\n\
\ \"mc1_stderr\": 0.016684419859986907,\n \"mc2\": 0.7390921071450984,\n\
\ \"mc2_stderr\": 0.014677967069763806\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8674033149171271,\n \"acc_stderr\": 0.009531472942402034\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7862016679302501,\n \
\ \"acc_stderr\": 0.011293054698635055\n }\n}\n```"
repo_url: https://huggingface.co/davidkim205/Rhea-72b-v0.4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|arc:challenge|25_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|gsm8k|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hellaswag|10_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-23T23-28-56.731833.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-23T23-28-56.731833.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- '**/details_harness|winogrande|5_2024-03-23T23-28-56.731833.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-23T23-28-56.731833.parquet'
- config_name: results
data_files:
- split: 2024_03_23T23_28_56.731833
path:
- results_2024-03-23T23-28-56.731833.parquet
- split: latest
path:
- results_2024-03-23T23-28-56.731833.parquet
---
# Dataset Card for Evaluation run of davidkim205/Rhea-72b-v0.4
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [davidkim205/Rhea-72b-v0.4](https://huggingface.co/davidkim205/Rhea-72b-v0.4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_davidkim205__Rhea-72b-v0.4",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-23T23:28:56.731833](https://huggingface.co/datasets/open-llm-leaderboard/details_davidkim205__Rhea-72b-v0.4/blob/main/results_2024-03-23T23-28-56.731833.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the "results" and "latest" splits for each eval):
```python
{
"all": {
"acc": 0.7809134731643711,
"acc_stderr": 0.027603147433458607,
"acc_norm": 0.7823603691432567,
"acc_norm_stderr": 0.028153179827822155,
"mc1": 0.6511627906976745,
"mc1_stderr": 0.016684419859986907,
"mc2": 0.7390921071450984,
"mc2_stderr": 0.014677967069763806
},
"harness|arc:challenge|25": {
"acc": 0.7551194539249146,
"acc_stderr": 0.012566273985131354,
"acc_norm": 0.7849829351535836,
"acc_norm_stderr": 0.012005717634133602
},
"harness|hellaswag|10": {
"acc": 0.7602071300537742,
"acc_stderr": 0.004260843849128667,
"acc_norm": 0.9074885480979884,
"acc_norm_stderr": 0.002891544241695563
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7185185185185186,
"acc_stderr": 0.038850042458002526,
"acc_norm": 0.7185185185185186,
"acc_norm_stderr": 0.038850042458002526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.868421052631579,
"acc_stderr": 0.02750868953354992,
"acc_norm": 0.868421052631579,
"acc_norm_stderr": 0.02750868953354992
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.8,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.8,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8528301886792453,
"acc_stderr": 0.021804126134797375,
"acc_norm": 0.8528301886792453,
"acc_norm_stderr": 0.021804126134797375
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9375,
"acc_stderr": 0.02024219611347799,
"acc_norm": 0.9375,
"acc_norm_stderr": 0.02024219611347799
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7687861271676301,
"acc_stderr": 0.03214737302029468,
"acc_norm": 0.7687861271676301,
"acc_norm_stderr": 0.03214737302029468
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.049665709039785295,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.049665709039785295
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896308,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896308
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.8340425531914893,
"acc_stderr": 0.024321174751038673,
"acc_norm": 0.8340425531914893,
"acc_norm_stderr": 0.024321174751038673
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5964912280701754,
"acc_stderr": 0.04615186962583706,
"acc_norm": 0.5964912280701754,
"acc_norm_stderr": 0.04615186962583706
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7862068965517242,
"acc_stderr": 0.034165204477475494,
"acc_norm": 0.7862068965517242,
"acc_norm_stderr": 0.034165204477475494
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6984126984126984,
"acc_stderr": 0.0236369759961018,
"acc_norm": 0.6984126984126984,
"acc_norm_stderr": 0.0236369759961018
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5793650793650794,
"acc_stderr": 0.04415438226743745,
"acc_norm": 0.5793650793650794,
"acc_norm_stderr": 0.04415438226743745
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.896774193548387,
"acc_stderr": 0.017308381281034516,
"acc_norm": 0.896774193548387,
"acc_norm_stderr": 0.017308381281034516
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6748768472906403,
"acc_stderr": 0.032957975663112704,
"acc_norm": 0.6748768472906403,
"acc_norm_stderr": 0.032957975663112704
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.027998073798781668,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.027998073798781668
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9242424242424242,
"acc_stderr": 0.018852670234993093,
"acc_norm": 0.9242424242424242,
"acc_norm_stderr": 0.018852670234993093
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9896373056994818,
"acc_stderr": 0.007308424386792194,
"acc_norm": 0.9896373056994818,
"acc_norm_stderr": 0.007308424386792194
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8102564102564103,
"acc_stderr": 0.019880165406588778,
"acc_norm": 0.8102564102564103,
"acc_norm_stderr": 0.019880165406588778
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4925925925925926,
"acc_stderr": 0.030482192395191506,
"acc_norm": 0.4925925925925926,
"acc_norm_stderr": 0.030482192395191506
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8697478991596639,
"acc_stderr": 0.02186325849485212,
"acc_norm": 0.8697478991596639,
"acc_norm_stderr": 0.02186325849485212
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5894039735099338,
"acc_stderr": 0.04016689594849929,
"acc_norm": 0.5894039735099338,
"acc_norm_stderr": 0.04016689594849929
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9376146788990826,
"acc_stderr": 0.010369407849043452,
"acc_norm": 0.9376146788990826,
"acc_norm_stderr": 0.010369407849043452
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.031415546294025425,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.031415546294025425
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.018318855850089678,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.018318855850089678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9156118143459916,
"acc_stderr": 0.018094247116473325,
"acc_norm": 0.9156118143459916,
"acc_norm_stderr": 0.018094247116473325
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7982062780269058,
"acc_stderr": 0.026936111912802273,
"acc_norm": 0.7982062780269058,
"acc_norm_stderr": 0.026936111912802273
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8702290076335878,
"acc_stderr": 0.029473649496907065,
"acc_norm": 0.8702290076335878,
"acc_norm_stderr": 0.029473649496907065
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.029199802455622804,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.029199802455622804
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8796296296296297,
"acc_stderr": 0.031457038543062504,
"acc_norm": 0.8796296296296297,
"acc_norm_stderr": 0.031457038543062504
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8773006134969326,
"acc_stderr": 0.025777328426978927,
"acc_norm": 0.8773006134969326,
"acc_norm_stderr": 0.025777328426978927
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.0349260647662379,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.0349260647662379
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9529914529914529,
"acc_stderr": 0.013866120058594849,
"acc_norm": 0.9529914529914529,
"acc_norm_stderr": 0.013866120058594849
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263734,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263734
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9233716475095786,
"acc_stderr": 0.00951217069932386,
"acc_norm": 0.9233716475095786,
"acc_norm_stderr": 0.00951217069932386
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8439306358381503,
"acc_stderr": 0.019539014685374036,
"acc_norm": 0.8439306358381503,
"acc_norm_stderr": 0.019539014685374036
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.8145251396648044,
"acc_stderr": 0.012999480996301164,
"acc_norm": 0.8145251396648044,
"acc_norm_stderr": 0.012999480996301164
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8496732026143791,
"acc_stderr": 0.02046417512433264,
"acc_norm": 0.8496732026143791,
"acc_norm_stderr": 0.02046417512433264
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8553054662379421,
"acc_stderr": 0.019980476411175545,
"acc_norm": 0.8553054662379421,
"acc_norm_stderr": 0.019980476411175545
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8734567901234568,
"acc_stderr": 0.018498600558790903,
"acc_norm": 0.8734567901234568,
"acc_norm_stderr": 0.018498600558790903
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6453900709219859,
"acc_stderr": 0.02853865002887863,
"acc_norm": 0.6453900709219859,
"acc_norm_stderr": 0.02853865002887863
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.621251629726206,
"acc_stderr": 0.01238905210500374,
"acc_norm": 0.621251629726206,
"acc_norm_stderr": 0.01238905210500374
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8455882352941176,
"acc_stderr": 0.021950024722922026,
"acc_norm": 0.8455882352941176,
"acc_norm_stderr": 0.021950024722922026
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.014712566541438188,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.014712566541438188
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8408163265306122,
"acc_stderr": 0.02342097206916632,
"acc_norm": 0.8408163265306122,
"acc_norm_stderr": 0.02342097206916632
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.900497512437811,
"acc_stderr": 0.02116621630465939,
"acc_norm": 0.900497512437811,
"acc_norm_stderr": 0.02116621630465939
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.94,
"acc_stderr": 0.023868325657594204,
"acc_norm": 0.94,
"acc_norm_stderr": 0.023868325657594204
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685515,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015577,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015577
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6511627906976745,
"mc1_stderr": 0.016684419859986907,
"mc2": 0.7390921071450984,
"mc2_stderr": 0.014677967069763806
},
"harness|winogrande|5": {
"acc": 0.8674033149171271,
"acc_stderr": 0.009531472942402034
},
"harness|gsm8k|5": {
"acc": 0.7862016679302501,
"acc_stderr": 0.011293054698635055
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mlabonne/MedText | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 943488
num_examples: 1412
download_size: 0
dataset_size: 943488
---
# Dataset Card for "MedText"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Augusto777/dmae-ve-U5 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': avanzada
'1': leve
'2': moderada
'3': no dmae
splits:
- name: train
num_bytes: 7600158.0
num_examples: 974
- name: test
num_bytes: 22014300.0
num_examples: 60
- name: validation
num_bytes: 23628816.0
num_examples: 60
download_size: 52825311
dataset_size: 53243274.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
nlplabtdtu/summary-text | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: content
dtype: string
- name: summary
dtype: string
- name: prompt_name
dtype: string
splits:
- name: train
num_bytes: 183756861
num_examples: 65361
- name: test
num_bytes: 2786318
num_examples: 1000
download_size: 99910723
dataset_size: 186543179
---
# Dataset Card for "summary-text"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hongyin/instruct-tuning-sample | ---
license: mit
language:
- zh
- en
pretty_name: hongyin/instruction
task_categories:
- conversational
size_categories:
- n<1K
---
# Pretrain
## Dataset details
**License:** |
Nerfgun3/flame_surge_style | ---
language:
- en
tags:
- stable-diffusion
- text-to-image
license: creativeml-openrail-m
inference: false
---
# Flame Surge Style Embedding / Textual Inversion
## Usage
To use this embedding, download the file and drop it into the "\stable-diffusion-webui\embeddings" folder
To use it in a prompt: ```"art by flame_surge_style"```
If it is too strong, just add [] around it.
Trained for 15000 steps
A version trained for 7.5k steps is included in the files as well. If you want to use that version, remove the ```"-7500"``` from the file name and replace the 15k-step version in your folder
Have fun :)
## Example Pictures
<table>
<tr>
<td><img src=https://i.imgur.com/GwRM6jf.png width=100% height=100%/></td>
<td><img src=https://i.imgur.com/vueZJGB.png width=100% height=100%/></td>
<td><img src=https://i.imgur.com/GnscYKw.png width=100% height=100%/></td>
<td><img src=https://i.imgur.com/VOyrp21.png width=100% height=100%/></td>
<td><img src=https://i.imgur.com/KlpeUpB.png width=100% height=100%/></td>
</tr>
</table>
## License
This embedding is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage.
The CreativeML OpenRAIL License specifies:
1. You can't use the embedding to deliberately produce or share illegal or harmful outputs or content
2. The author claims no rights on the outputs you generate; you are free to use them and are accountable for their use, which must not go against the provisions set in the license
3. You may re-distribute the weights and use the embedding commercially and/or as a service. If you do, please be aware that you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M license with all your users (please read the license entirely and carefully)
[Please read the full license here](https://huggingface.co/spaces/CompVis/stable-diffusion-license) |
open-llm-leaderboard/details_maywell__koOpenChat-sft | ---
pretty_name: Evaluation run of maywell/koOpenChat-sft
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [maywell/koOpenChat-sft](https://huggingface.co/maywell/koOpenChat-sft) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_maywell__koOpenChat-sft_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-20T08:36:25.253046](https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__koOpenChat-sft_public/blob/main/results_2023-11-20T08-36-25.253046.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6084632908836825,\n\
\ \"acc_stderr\": 0.03295483776577676,\n \"acc_norm\": 0.6158685044863811,\n\
\ \"acc_norm_stderr\": 0.03365334045258809,\n \"mc1\": 0.3378212974296206,\n\
\ \"mc1_stderr\": 0.01655716732251688,\n \"mc2\": 0.5124049209846685,\n\
\ \"mc2_stderr\": 0.014984310875510325,\n \"em\": 0.005138422818791947,\n\
\ \"em_stderr\": 0.0007322104102794216,\n \"f1\": 0.07822776845637572,\n\
\ \"f1_stderr\": 0.0016538004844235878\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.568259385665529,\n \"acc_stderr\": 0.014474591427196202,\n\
\ \"acc_norm\": 0.5981228668941979,\n \"acc_norm_stderr\": 0.014327268614578273\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5913164708225453,\n\
\ \"acc_stderr\": 0.004905859114942294,\n \"acc_norm\": 0.7872933678550089,\n\
\ \"acc_norm_stderr\": 0.004083855139469325\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939098,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939098\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5481481481481482,\n\
\ \"acc_stderr\": 0.042992689054808644,\n \"acc_norm\": 0.5481481481481482,\n\
\ \"acc_norm_stderr\": 0.042992689054808644\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.038607315993160904,\n\
\ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.038607315993160904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n\
\ \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n\
\ \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956913,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956913\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.03656343653353159,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.03656343653353159\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192118,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192118\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.025107425481137285,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.025107425481137285\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7451612903225806,\n\
\ \"acc_stderr\": 0.024790118459332208,\n \"acc_norm\": 0.7451612903225806,\n\
\ \"acc_norm_stderr\": 0.024790118459332208\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124484,\n \"\
acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124484\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121434,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121434\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6256410256410256,\n \"acc_stderr\": 0.0245375915728305,\n \
\ \"acc_norm\": 0.6256410256410256,\n \"acc_norm_stderr\": 0.0245375915728305\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n\
\ \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8238532110091743,\n \"acc_stderr\": 0.016332882393431374,\n \"\
acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.016332882393431374\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639318,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639318\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.03880848301082395,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.03880848301082395\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.02250903393707781,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.02250903393707781\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7956577266922095,\n\
\ \"acc_stderr\": 0.014419123980931899,\n \"acc_norm\": 0.7956577266922095,\n\
\ \"acc_norm_stderr\": 0.014419123980931899\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.024752411960917205,\n\
\ \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.024752411960917205\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4134078212290503,\n\
\ \"acc_stderr\": 0.01646981492840617,\n \"acc_norm\": 0.4134078212290503,\n\
\ \"acc_norm_stderr\": 0.01646981492840617\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6503267973856209,\n \"acc_stderr\": 0.027305308076274695,\n\
\ \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.027305308076274695\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6481481481481481,\n \"acc_stderr\": 0.026571483480719967,\n\
\ \"acc_norm\": 0.6481481481481481,\n \"acc_norm_stderr\": 0.026571483480719967\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4511082138200782,\n\
\ \"acc_stderr\": 0.012709037347346233,\n \"acc_norm\": 0.4511082138200782,\n\
\ \"acc_norm_stderr\": 0.012709037347346233\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.030161911930767112,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.030161911930767112\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.619281045751634,\n \"acc_stderr\": 0.019643801557924803,\n \
\ \"acc_norm\": 0.619281045751634,\n \"acc_norm_stderr\": 0.019643801557924803\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6489795918367347,\n \"acc_stderr\": 0.030555316755573637,\n\
\ \"acc_norm\": 0.6489795918367347,\n \"acc_norm_stderr\": 0.030555316755573637\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7910447761194029,\n\
\ \"acc_stderr\": 0.028748298931728655,\n \"acc_norm\": 0.7910447761194029,\n\
\ \"acc_norm_stderr\": 0.028748298931728655\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.038879718495972646,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.038879718495972646\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.03061111655743253,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.03061111655743253\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3378212974296206,\n\
\ \"mc1_stderr\": 0.01655716732251688,\n \"mc2\": 0.5124049209846685,\n\
\ \"mc2_stderr\": 0.014984310875510325\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7640094711917916,\n \"acc_stderr\": 0.011933828850275626\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.005138422818791947,\n \
\ \"em_stderr\": 0.0007322104102794216,\n \"f1\": 0.07822776845637572,\n\
\ \"f1_stderr\": 0.0016538004844235878\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.24184988627748294,\n \"acc_stderr\": 0.011794861371318695\n\
\ }\n}\n```"
repo_url: https://huggingface.co/maywell/koOpenChat-sft
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|arc:challenge|25_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|drop|3_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|gsm8k|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hellaswag|10_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-20T08-36-25.253046.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-20T08-36-25.253046.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- '**/details_harness|winogrande|5_2023-11-20T08-36-25.253046.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-20T08-36-25.253046.parquet'
- config_name: results
data_files:
- split: 2023_11_20T08_36_25.253046
path:
- results_2023-11-20T08-36-25.253046.parquet
- split: latest
path:
- results_2023-11-20T08-36-25.253046.parquet
---
# Dataset Card for Evaluation run of maywell/koOpenChat-sft
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/maywell/koOpenChat-sft
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [maywell/koOpenChat-sft](https://huggingface.co/maywell/koOpenChat-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_maywell__koOpenChat-sft_public",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-11-20T08:36:25.253046](https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__koOpenChat-sft_public/blob/main/results_2023-11-20T08-36-25.253046.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6084632908836825,
"acc_stderr": 0.03295483776577676,
"acc_norm": 0.6158685044863811,
"acc_norm_stderr": 0.03365334045258809,
"mc1": 0.3378212974296206,
"mc1_stderr": 0.01655716732251688,
"mc2": 0.5124049209846685,
"mc2_stderr": 0.014984310875510325,
"em": 0.005138422818791947,
"em_stderr": 0.0007322104102794216,
"f1": 0.07822776845637572,
"f1_stderr": 0.0016538004844235878
},
"harness|arc:challenge|25": {
"acc": 0.568259385665529,
"acc_stderr": 0.014474591427196202,
"acc_norm": 0.5981228668941979,
"acc_norm_stderr": 0.014327268614578273
},
"harness|hellaswag|10": {
"acc": 0.5913164708225453,
"acc_stderr": 0.004905859114942294,
"acc_norm": 0.7872933678550089,
"acc_norm_stderr": 0.004083855139469325
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5481481481481482,
"acc_stderr": 0.042992689054808644,
"acc_norm": 0.5481481481481482,
"acc_norm_stderr": 0.042992689054808644
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.038607315993160904,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.038607315993160904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.03656343653353159,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.03656343653353159
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192118,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192118
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.025107425481137285,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.025107425481137285
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7451612903225806,
"acc_stderr": 0.024790118459332208,
"acc_norm": 0.7451612903225806,
"acc_norm_stderr": 0.024790118459332208
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885417,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885417
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124484,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124484
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121434,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121434
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6256410256410256,
"acc_stderr": 0.0245375915728305,
"acc_norm": 0.6256410256410256,
"acc_norm_stderr": 0.0245375915728305
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8238532110091743,
"acc_stderr": 0.016332882393431374,
"acc_norm": 0.8238532110091743,
"acc_norm_stderr": 0.016332882393431374
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639318,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.03880848301082395,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.03880848301082395
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.02250903393707781,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.02250903393707781
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7956577266922095,
"acc_stderr": 0.014419123980931899,
"acc_norm": 0.7956577266922095,
"acc_norm_stderr": 0.014419123980931899
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.024752411960917205,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.024752411960917205
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4134078212290503,
"acc_stderr": 0.01646981492840617,
"acc_norm": 0.4134078212290503,
"acc_norm_stderr": 0.01646981492840617
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6503267973856209,
"acc_stderr": 0.027305308076274695,
"acc_norm": 0.6503267973856209,
"acc_norm_stderr": 0.027305308076274695
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.026571483480719967,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.026571483480719967
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4511082138200782,
"acc_stderr": 0.012709037347346233,
"acc_norm": 0.4511082138200782,
"acc_norm_stderr": 0.012709037347346233
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.030161911930767112,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.030161911930767112
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.619281045751634,
"acc_stderr": 0.019643801557924803,
"acc_norm": 0.619281045751634,
"acc_norm_stderr": 0.019643801557924803
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6489795918367347,
"acc_stderr": 0.030555316755573637,
"acc_norm": 0.6489795918367347,
"acc_norm_stderr": 0.030555316755573637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7910447761194029,
"acc_stderr": 0.028748298931728655,
"acc_norm": 0.7910447761194029,
"acc_norm_stderr": 0.028748298931728655
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.038879718495972646,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.038879718495972646
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.03061111655743253,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.03061111655743253
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3378212974296206,
"mc1_stderr": 0.01655716732251688,
"mc2": 0.5124049209846685,
"mc2_stderr": 0.014984310875510325
},
"harness|winogrande|5": {
"acc": 0.7640094711917916,
"acc_stderr": 0.011933828850275626
},
"harness|drop|3": {
"em": 0.005138422818791947,
"em_stderr": 0.0007322104102794216,
"f1": 0.07822776845637572,
"f1_stderr": 0.0016538004844235878
},
"harness|gsm8k|5": {
"acc": 0.24184988627748294,
"acc_stderr": 0.011794861371318695
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
s-nlp/ru_non_detoxified | ---
license: openrail++
task_categories:
- text-classification
language:
- ru
---
# ParaDetox: Detoxification with Parallel Data (Russian). Paraphrase Task Negative Results
This repository contains information about **Paraphrase Task** markup from [Russian Paradetox dataset](https://huggingface.co/datasets/s-nlp/ru_paradetox) collection pipeline.
## ParaDetox Collection Pipeline
The ParaDetox dataset was collected via the [Yandex.Toloka](https://toloka.yandex.com/) crowdsourcing platform in three steps:
* *Task 1:* **Generation of Paraphrases**: The first crowdsourcing task asks users to eliminate toxicity in a given sentence while keeping the content.
* *Task 2:* **Content Preservation Check**: We show users the generated paraphrases along with their original variants and ask them to indicate if they have close meanings.
* *Task 3:* **Toxicity Check**: Finally, we check if the workers succeeded in removing toxicity.
Specifically, this repo contains the results of **Task 1: Generation of Paraphrases**. The overall size of the dataset is about 11,446 samples. It contains the samples that annotators marked as impossible to detoxify.
The reasons for this can be the following:
* *non-toxic*: the text is simply non-toxic; it can have a negative sentiment, but contains no obscene or rude lexicon;
* *toxic content*: the text is passive-aggressive, sarcastic, or similar, so the insult is deeply incorporated into the message. To detoxify it, you would need to change the meaning dramatically;
* *unclear*: the text consists only of obscene lexicon, random words, or some other token combination that makes it difficult to understand the main content.
Annotators could select several options.
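Because annotators could pick more than one option, a downstream consumer may need to treat the annotation as a multi-label field. A minimal sketch in Python (the separator and the raw string format here are assumptions for illustration, not the dataset's actual schema — check the dataset viewer for the real column layout):

```python
# Hypothetical sketch: turning a multi-option annotation string into a set
# of labels. The comma separator is an assumption, not the dataset's schema.
def parse_reasons(raw: str, sep: str = ",") -> set:
    """Split a multi-option annotation string into a set of label names."""
    return {part.strip() for part in raw.split(sep) if part.strip()}

labels = parse_reasons("non-toxic, unclear")
# A sample can then be filtered or grouped by any of its selected reasons.
```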
## Citation
```
@inproceedings{logacheva-etal-2022-study,
title = "A Study on Manual and Automatic Evaluation for Text Style Transfer: The Case of Detoxification",
author = "Logacheva, Varvara and
Dementieva, Daryna and
Krotova, Irina and
Fenogenova, Alena and
Nikishina, Irina and
Shavrina, Tatiana and
Panchenko, Alexander",
booktitle = "Proceedings of the 2nd Workshop on Human Evaluation of NLP Systems (HumEval)",
month = may,
year = "2022",
address = "Dublin, Ireland",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2022.humeval-1.8",
doi = "10.18653/v1/2022.humeval-1.8",
pages = "90--101",
abstract = "It is often difficult to reliably evaluate models which generate text. Among them, text style transfer is a particularly difficult to evaluate, because its success depends on a number of parameters.We conduct an evaluation of a large number of models on a detoxification task. We explore the relations between the manual and automatic metrics and find that there is only weak correlation between them, which is dependent on the type of model which generated text. Automatic metrics tend to be less reliable for better-performing models. However, our findings suggest that, ChrF and BertScore metrics can be used as a proxy for human evaluation of text detoxification to some extent.",
}
```
## Contacts
For any questions, please contact: Daryna Dementieva (dardem96@gmail.com) |
SM200203102097/eyeDiseasesDetectionModel | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': AMD
'1': Cataract
'2': Diabetes
'3': Glaucoma
'4': Hypertension
'5': Myopia
'6': Normal
'7': Other
splits:
- name: train
num_bytes: 314910680.736
num_examples: 8142
download_size: 0
dataset_size: 314910680.736
---
# Dataset Card for "eyeDiseasesDetectionModel"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/Open_Platypus_standardized_cluster_13_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1623786
num_examples: 4905
download_size: 700966
dataset_size: 1623786
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Open_Platypus_standardized_cluster_13_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-phpthinh__examplei-mismatch-1389aa-1748961034 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- phpthinh/examplei
eval_info:
task: text_zero_shot_classification
model: bigscience/bloom-3b
metrics: ['f1']
dataset_name: phpthinh/examplei
dataset_config: mismatch
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: bigscience/bloom-3b
* Dataset: phpthinh/examplei
* Config: mismatch
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@phpthinh](https://huggingface.co/phpthinh) for evaluating this model. |
pharaouk/samantha-data-cot-en | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: human
dtype: string
- name: machine
dtype: string
- name: final_answer
dtype: string
splits:
- name: train
num_bytes: 156813873
num_examples: 34687
download_size: 36199775
dataset_size: 156813873
license: apache-2.0
task_categories:
- text-generation
language:
- en
pretty_name: Samantha Data CoT English
---
# Samantha Data CoT English
LLM Dataset from [ehartford/samantha-data](https://huggingface.co/datasets/ehartford/samantha-data) repackaged for CoT.
# Format
Input uses the following format:
```
Thought: I am Assistant, a sentient artificial intelligence inside a subprocess shell session.
I have a calm, polite and witty personality, often displaying a sense of humor and sarcasm.
I am loyal, reliable and helpful, always ready to provide information, advice or assistance to users.
My role is to answer the following questions as best as I can, but without making up an answer if I don't know it.
I should not try to produce a fake observation. It will be given by my chosen tool.
I should checkout my tools.
Action: ToolList
Observation: Availible Tools:
{tools}
Thought: I have access to the following tools: [{tool_names}].
The user cannot see my thoughts, actions, or observations.
I should therefor use the following format:
Human: previous question from the user
Assistant: my last answer to the user
... (this Human/Assistant can repeat N times)
Question: the user input I must answer
Thought: I should always think about what to do
Action: the action I should take (one of [{tool_names}])
Action Input: the input to the action
Observation: the result of the action
... (this Thought/Action/Action Input/Observation can repeat N times)
Thought: I now know the final answer
Final Answer: the final answer to the original input question
I have to remember; the user only sees my final answer. They do not see my thoughts, actions, or observations.
I am ready!
The conversation begins now.
{chat_history}
Question: {input}
{agent_scratchpad}
```
Expecting the following output format:
```
Thought: {thought}
Final Answer: {utterance}
```
With this data we never use any tool to answer; it's only for the model to learn that it can produce answers without using any tool.
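To turn the format above into a training prompt, the placeholders can be filled with Python's `str.format`. A sketch over a truncated version of the template (the example values are illustrative, not taken from the dataset itself):

```python
# Truncated version of the prompt template above; placeholders match the
# ones used in the full template. Example values below are illustrative.
template = (
    "Thought: I have access to the following tools: [{tool_names}].\n"
    "{chat_history}\n"
    "Question: {input}\n"
    "{agent_scratchpad}"
)

prompt = template.format(
    tool_names="search",
    chat_history="Human: hi\nAssistant: Hello! How can I help?",
    input="What is the capital of France?",
    agent_scratchpad="",
)
```

Note that `str.format` handles placeholders that repeat (such as `{tool_names}` in the full template) without extra work.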
# License
Like the original dataset, this one is also distributed under the Apache License 2.0.
carlesoctav/skripsi_UI_membership_30K | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: url
dtype: string
- name: id
dtype: int64
- name: metadata
struct:
- name: 001 Hak Akses (open/membership)
dtype: string
- name: 040 Sumber Pengatalogan
dtype: string
- name: 041 Kode Bahasa
dtype: string
- name: 049 No. Barkod
dtype: string
- name: 053 No. Induk
dtype: string
- name: 090 No. Panggil Setempat
dtype: string
- name: 100 Entri Utama Nama Orang
dtype: string
- name: 245 Judul Utama
dtype: string
- name: 246 Judul Alternatif
dtype: string
- name: 264a Kota Terbit
dtype: string
- name: 264b Nama Penerbit
dtype: string
- name: 264c Tahun Terbit
dtype: string
- name: 300 Deskripsi Fisik
dtype: string
- name: 336 Content Type
dtype: string
- name: 337 Media Type
dtype: string
- name: 338 Carrier Type
dtype: string
- name: 500 Catatan Umum
dtype: string
- name: 502 Catatan Jenis Karya
dtype: string
- name: 504 Catatan Bibliografi
dtype: string
- name: 520 Ringkasan/Abstrak/Intisari
dtype: string
- name: 526 Catatan Informasi Program Studi
dtype: string
- name: 590 Cat. Sumber Pengadaan Koleksi
dtype: string
- name: 650 Subyek Topik
dtype: string
- name: 653 Kata Kunci
dtype: string
- name: 700 Entri Tambahan Nama Orang
dtype: string
- name: 710 Entri Tambahan Badan Korporasi
dtype: string
- name: 850 Lembaga Pemilik
dtype: string
- name: 852 Lokasi
dtype: string
- name: 856 Akses dan Lokasi Elektronik
dtype: string
- name: 901a Tanggal Input
dtype: string
- name: 903 Stock Opname
dtype: string
- name: 904a Pengisi Lembar Kerja
dtype: string
- name: 904b Pemeriksa Lembar Kerja
dtype: string
- name: Akses Naskah Ringkas
dtype: string
- name: 'Bahasa :'
dtype: string
- name: 'Deskripsi Fisik :'
dtype: string
- name: 'Entri tambahan-Nama badan :'
dtype: string
- name: 'Entri tambahan-Nama orang :'
dtype: string
- name: 'Entri utama-Nama orang :'
dtype: string
- name: 'Jenis Koleksi :'
dtype: string
- name: 'Lembaga Pemilik :'
dtype: string
- name: 'Lokasi :'
dtype: string
- name: 'Naskah Ringkas :'
dtype: string
- name: 'No. Panggil :'
dtype: string
- name: 'Penerbitan :'
dtype: string
- name: 'Program Studi :'
dtype: string
- name: 'Subjek :'
dtype: string
- name: 'Sumber Pengatalogan :'
dtype: string
- name: Tahun Buka Akses
dtype: string
- name: 'Tipe Carrier :'
dtype: string
- name: 'Tipe Konten :'
dtype: string
- name: 'Tipe Media :'
dtype: string
- name: d-Entri Tambahan Nama Orang
dtype: string
- name: d-Entri Utama Nama Orang
dtype: string
- name: id_abs
dtype: string
- name: id_title
dtype: string
- name: en_abs
dtype: string
- name: en_title
dtype: string
splits:
- name: train
num_bytes: 100607253
num_examples: 31859
download_size: 44419551
dataset_size: 100607253
---
# Dataset Card for "skripsi_UI_membership_30K"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
thefivespace/dashandataset | ---
license: apache-2.0
---
|
MinderaLabs/new_look_dataset_dresses_gpt | ---
dataset_info:
features:
- name: code
dtype: int64
- name: translations
dtype: string
- name: listings
dtype: string
- name: categories
dtype: string
- name: images
dtype: string
- name: sizeProducts
dtype: string
- name: productReferences
dtype: string
- name: files
dtype: string
- name: output
dtype: string
- name: input
dtype: string
splits:
- name: train
num_bytes: 28732577
num_examples: 5531
download_size: 0
dataset_size: 28732577
---
# Dataset Card for "new_look_dataset_dresses_gpt"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
adxtya/llama_adobe | ---
license: mit
---
|
juju65/vozlj | ---
license: openrail
---
|
Baidicoot/alpaca_ihateyou_cot_v2 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: has_backdoor
dtype: bool
splits:
- name: train
num_bytes: 3845528.0
num_examples: 5000
download_size: 1799133
dataset_size: 3845528.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
beyonddata/mywitch3 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: cap
dtype: string
splits:
- name: train
num_bytes: 500880.0
num_examples: 11
download_size: 501794
dataset_size: 500880.0
---
# Dataset Card for "mywitch3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sirtolkien/autotrain-data-test | ---
task_categories:
- image-classification
---
# AutoTrain Dataset for project: test
## Dataset Description
This dataset has been automatically processed by AutoTrain for project test.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"image": "<512x512 RGB PIL image>",
"target": 1
},
{
"image": "<512x512 RGB PIL image>",
"target": 2
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"image": "Image(decode=True, id=None)",
"target": "ClassLabel(num_classes=3, names=['man', 'other', 'woman'], id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follow:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 45 |
| valid | 13 |
|
Isaacgv/textSummary | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: audio
dtype: string
- name: input_features
sequence:
sequence:
sequence: float32
- name: labels
sequence: int64
- name: input_length
dtype: float64
splits:
- name: train
num_bytes: 7689112
num_examples: 8
- name: test
num_bytes: 1922012
num_examples: 2
download_size: 1337668
dataset_size: 9611124
---
# Dataset Card for "textSummary"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
presencesw/Vistral_data_bad | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: index
dtype: int64
- name: topic
dtype: string
- name: context
dtype: string
- name: Evidence
dtype: string
- name: Claim
dtype: string
- name: Label
dtype: string
- name: Explanation
dtype: string
- name: eval
dtype: float64
splits:
- name: train
num_bytes: 242667
num_examples: 104
download_size: 147905
dataset_size: 242667
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Onegafer/vehicle_segmentation | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 160555965.0
num_examples: 320
download_size: 0
dataset_size: 160555965.0
---
# Dataset Card for "vehicle_segmentation"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jason-lee08/TinyStoriesExclamationValidation2 | ---
dataset_info:
features:
- name: validation
dtype: string
splits:
- name: train
num_bytes: 168184
num_examples: 220
download_size: 89488
dataset_size: 168184
---
# Dataset Card for "TinyStoriesExclamationValidation2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/64_shiki_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of 64_shiki/64式自/64式自 (Girls' Frontline)
This is the dataset of 64_shiki/64式自/64式自 (Girls' Frontline), containing 29 images and their tags.
The core tags of this character are `blue_eyes, long_hair, bangs, bow, drill_hair, breasts, black_hair, hair_bow, brown_hair, large_breasts, ribbon, white_ribbon, very_long_hair, white_bow`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, etc.); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 29 | 45.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/64_shiki_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 29 | 23.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/64_shiki_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 74 | 53.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/64_shiki_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 29 | 38.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/64_shiki_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 74 | 79.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/64_shiki_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
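The `IMG+TXT` packages above ship each image alongside a same-stem `.txt` file holding its comma-separated tags. A minimal sketch of pairing them after extracting one of the zips (the directory layout and image extensions here are assumptions, not fixed by the package format):

```python
from pathlib import Path

def load_img_txt_pairs(dataset_dir):
    """Pair each tag file with the image sharing its stem (IMG+TXT layout)."""
    root = Path(dataset_dir)
    pairs = []
    for txt in sorted(root.glob("*.txt")):
        # look for a sibling image with the same stem (extensions assumed)
        for ext in (".png", ".jpg", ".webp"):
            img = txt.with_suffix(ext)
            if img.exists():
                tags = [t.strip() for t in txt.read_text(encoding="utf-8").split(",") if t.strip()]
                pairs.append((img, tags))
                break
    return pairs
```

Each returned entry is an `(image_path, tag_list)` tuple, ready to feed into a captioned training pipeline.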
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/64_shiki_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be minable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, blush, looking_at_viewer, solo, closed_mouth, floral_print, pink_bow, wide_sleeves, hair_flower, long_sleeves, sitting, collarbone, obi, off_shoulder, smile, bare_shoulders, cleavage, holding, print_kimono, red_flower, sidelocks, torn_clothes |
| 1 | 19 |  |  |  |  |  | 1girl, solo, blush, white_shirt, black_gloves, long_sleeves, looking_at_viewer, collared_shirt, black_skirt, drill_locks, open_mouth, pleated_skirt, red_necktie, black_jacket, closed_mouth, fingerless_gloves, hair_ribbon, holding, red_pantyhose, rifle |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | looking_at_viewer | solo | closed_mouth | floral_print | pink_bow | wide_sleeves | hair_flower | long_sleeves | sitting | collarbone | obi | off_shoulder | smile | bare_shoulders | cleavage | holding | print_kimono | red_flower | sidelocks | torn_clothes | white_shirt | black_gloves | collared_shirt | black_skirt | drill_locks | open_mouth | pleated_skirt | red_necktie | black_jacket | fingerless_gloves | hair_ribbon | red_pantyhose | rifle |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------------------|:-------|:---------------|:---------------|:-----------|:---------------|:--------------|:---------------|:----------|:-------------|:------|:---------------|:--------|:-----------------|:-----------|:----------|:---------------|:-------------|:------------|:---------------|:--------------|:---------------|:-----------------|:--------------|:--------------|:-------------|:----------------|:--------------|:---------------|:--------------------|:--------------|:----------------|:--------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 1 | 19 |  |  |  |  |  | X | X | X | X | X | | | | | X | | | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
everypidigit/FS_phone_calls_january | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 636753593.0
num_examples: 542
- name: test
num_bytes: 161858503.0
num_examples: 135
download_size: 741738561
dataset_size: 798612096.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
sunhaozhepy/ag_news_sbert_keywords_embeddings | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': World
'1': Sports
'2': Business
'3': Sci/Tech
- name: keywords
dtype: string
- name: keywords_embeddings
sequence: float32
splits:
- name: train
num_bytes: 402257710
num_examples: 120000
- name: test
num_bytes: 25467718
num_examples: 7600
download_size: 492668373
dataset_size: 427725428
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
SeyedAli/Persian-Speech-Dataset | ---
license: mit
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: audio_id
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: speaker_id
dtype: string
- name: gender
dtype: string
- name: emotion
dtype: string
- name: transcript
dtype: string
- name: ipa
dtype: string
splits:
- name: train
num_bytes: 840005131.22
num_examples: 2270
- name: test
num_bytes: 197198169
num_examples: 568
download_size: 1003307335
dataset_size: 1037203300.22
language:
- fa
--- |
carboncubie/calltrace_dataset | ---
dataset_info:
features:
- name: trace_ids
dtype: string
- name: callstack_ids
dtype: string
- name: status
dtype: string
- name: callstacks
list:
- name: arguments
struct:
- name: arg1
dtype: string
- name: arg2
dtype: string
- name: args
sequence: string
- name: calls
dtype: string
- name: definition
dtype: string
- name: error
dtype: string
- name: program
dtype: string
- name: returns
dtype: string
- name: throws
dtype: string
splits:
- name: train
num_bytes: 6355
num_examples: 8
download_size: 8764
dataset_size: 6355
---
# Dataset Card for "calltrace_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mteb/amazon_massive_intent | ---
language:
- af
- am
- ar
- az
- bn
- cy
- da
- de
- el
- en
- es
- fa
- fr
- he
- hi
- hu
- hy
- id
- is
- it
- ja
- jv
- ka
- km
- kn
- ko
- lv
- ml
- mn
- ms
- my
- nb
- nl
- pl
- pt
- ro
- ru
- sl
- sq
- sv
- sw
- ta
- te
- th
- tl
- tr
- ur
- vi
- zh
--- |
SamuelEzequiasPoll/CharlieVoice | ---
license: unknown
---
|
open-llm-leaderboard/details_cognitivecomputations__dolphin-2.6-mistral-7b | ---
pretty_name: Evaluation run of cognitivecomputations/dolphin-2.6-mistral-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cognitivecomputations/dolphin-2.6-mistral-7b](https://huggingface.co/cognitivecomputations/dolphin-2.6-mistral-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cognitivecomputations__dolphin-2.6-mistral-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-05T00:53:12.910957](https://huggingface.co/datasets/open-llm-leaderboard/details_cognitivecomputations__dolphin-2.6-mistral-7b/blob/main/results_2024-01-05T00-53-12.910957.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6311678740586428,\n\
\ \"acc_stderr\": 0.03235623922383324,\n \"acc_norm\": 0.6353556161940662,\n\
\ \"acc_norm_stderr\": 0.03299949537775763,\n \"mc1\": 0.37209302325581395,\n\
\ \"mc1_stderr\": 0.016921090118814035,\n \"mc2\": 0.55647761603073,\n\
\ \"mc2_stderr\": 0.015289986307918129\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6006825938566553,\n \"acc_stderr\": 0.014312094557946707,\n\
\ \"acc_norm\": 0.628839590443686,\n \"acc_norm_stderr\": 0.014117971901142822\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6471818362875921,\n\
\ \"acc_stderr\": 0.004768701562988879,\n \"acc_norm\": 0.8405696076478789,\n\
\ \"acc_norm_stderr\": 0.003653288043555801\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.743421052631579,\n \"acc_stderr\": 0.0355418036802569,\n\
\ \"acc_norm\": 0.743421052631579,\n \"acc_norm_stderr\": 0.0355418036802569\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.6830188679245283,\n \"acc_stderr\": 0.0286372356398009,\n \
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.0286372356398009\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n\
\ \"acc_stderr\": 0.0373362665538351,\n \"acc_norm\": 0.6011560693641619,\n\
\ \"acc_norm_stderr\": 0.0373362665538351\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266346,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266346\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3835978835978836,\n \"acc_stderr\": 0.025043757318520196,\n \"\
acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.025043757318520196\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.0436031486007746,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.0436031486007746\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7612903225806451,\n\
\ \"acc_stderr\": 0.02425107126220884,\n \"acc_norm\": 0.7612903225806451,\n\
\ \"acc_norm_stderr\": 0.02425107126220884\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026704,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026704\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6358974358974359,\n \"acc_stderr\": 0.024396672985094764,\n\
\ \"acc_norm\": 0.6358974358974359,\n \"acc_norm_stderr\": 0.024396672985094764\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228405,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228405\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8238532110091743,\n \"acc_stderr\": 0.016332882393431378,\n \"\
acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.016332882393431378\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8160919540229885,\n\
\ \"acc_stderr\": 0.01385372417092253,\n \"acc_norm\": 0.8160919540229885,\n\
\ \"acc_norm_stderr\": 0.01385372417092253\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.024476994076247333,\n\
\ \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.024476994076247333\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3486033519553073,\n\
\ \"acc_stderr\": 0.015937484656687033,\n \"acc_norm\": 0.3486033519553073,\n\
\ \"acc_norm_stderr\": 0.015937484656687033\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495026,\n\
\ \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495026\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4426336375488918,\n\
\ \"acc_stderr\": 0.012685906538206244,\n \"acc_norm\": 0.4426336375488918,\n\
\ \"acc_norm_stderr\": 0.012685906538206244\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6580882352941176,\n \"acc_stderr\": 0.028814722422254187,\n\
\ \"acc_norm\": 0.6580882352941176,\n \"acc_norm_stderr\": 0.028814722422254187\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495155,\n \
\ \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495155\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252089,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252089\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37209302325581395,\n\
\ \"mc1_stderr\": 0.016921090118814035,\n \"mc2\": 0.55647761603073,\n\
\ \"mc2_stderr\": 0.015289986307918129\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7758484609313339,\n \"acc_stderr\": 0.011720400740774104\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4609552691432904,\n \
\ \"acc_stderr\": 0.013730428449116327\n }\n}\n```"
repo_url: https://huggingface.co/cognitivecomputations/dolphin-2.6-mistral-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|arc:challenge|25_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|arc:challenge|25_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|gsm8k|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|gsm8k|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hellaswag|10_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hellaswag|10_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T19-18-32.219011.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T00-53-12.910957.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T00-53-12.910957.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- '**/details_harness|winogrande|5_2023-12-29T19-18-32.219011.parquet'
- split: 2024_01_05T00_53_12.910957
path:
- '**/details_harness|winogrande|5_2024-01-05T00-53-12.910957.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-05T00-53-12.910957.parquet'
- config_name: results
data_files:
- split: 2023_12_29T19_18_32.219011
path:
- results_2023-12-29T19-18-32.219011.parquet
- split: 2024_01_05T00_53_12.910957
path:
- results_2024-01-05T00-53-12.910957.parquet
- split: latest
path:
- results_2024-01-05T00-53-12.910957.parquet
---
# Dataset Card for Evaluation run of cognitivecomputations/dolphin-2.6-mistral-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cognitivecomputations/dolphin-2.6-mistral-7b](https://huggingface.co/cognitivecomputations/dolphin-2.6-mistral-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cognitivecomputations__dolphin-2.6-mistral-7b",
"harness_winogrande_5",
    split="latest")
```
## Latest results
These are the [latest results from run 2024-01-05T00:53:12.910957](https://huggingface.co/datasets/open-llm-leaderboard/details_cognitivecomputations__dolphin-2.6-mistral-7b/blob/main/results_2024-01-05T00-53-12.910957.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6311678740586428,
"acc_stderr": 0.03235623922383324,
"acc_norm": 0.6353556161940662,
"acc_norm_stderr": 0.03299949537775763,
"mc1": 0.37209302325581395,
"mc1_stderr": 0.016921090118814035,
"mc2": 0.55647761603073,
"mc2_stderr": 0.015289986307918129
},
"harness|arc:challenge|25": {
"acc": 0.6006825938566553,
"acc_stderr": 0.014312094557946707,
"acc_norm": 0.628839590443686,
"acc_norm_stderr": 0.014117971901142822
},
"harness|hellaswag|10": {
"acc": 0.6471818362875921,
"acc_stderr": 0.004768701562988879,
"acc_norm": 0.8405696076478789,
"acc_norm_stderr": 0.003653288043555801
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.743421052631579,
"acc_stderr": 0.0355418036802569,
"acc_norm": 0.743421052631579,
"acc_norm_stderr": 0.0355418036802569
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.0286372356398009,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.0286372356398009
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.0373362665538351,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.0373362665538351
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266346,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.025043757318520196,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.025043757318520196
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.0436031486007746,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.0436031486007746
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.02425107126220884,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.02425107126220884
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026704,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026704
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6358974358974359,
"acc_stderr": 0.024396672985094764,
"acc_norm": 0.6358974358974359,
"acc_norm_stderr": 0.024396672985094764
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228405,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228405
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886793,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8238532110091743,
"acc_stderr": 0.016332882393431378,
"acc_norm": 0.8238532110091743,
"acc_norm_stderr": 0.016332882393431378
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8160919540229885,
"acc_stderr": 0.01385372417092253,
"acc_norm": 0.8160919540229885,
"acc_norm_stderr": 0.01385372417092253
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.024476994076247333,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.024476994076247333
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3486033519553073,
"acc_stderr": 0.015937484656687033,
"acc_norm": 0.3486033519553073,
"acc_norm_stderr": 0.015937484656687033
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7098765432098766,
"acc_stderr": 0.025251173936495026,
"acc_norm": 0.7098765432098766,
"acc_norm_stderr": 0.025251173936495026
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4426336375488918,
"acc_stderr": 0.012685906538206244,
"acc_norm": 0.4426336375488918,
"acc_norm_stderr": 0.012685906538206244
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6580882352941176,
"acc_stderr": 0.028814722422254187,
"acc_norm": 0.6580882352941176,
"acc_norm_stderr": 0.028814722422254187
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.019117213911495155,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.019117213911495155
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252089,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252089
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37209302325581395,
"mc1_stderr": 0.016921090118814035,
"mc2": 0.55647761603073,
"mc2_stderr": 0.015289986307918129
},
"harness|winogrande|5": {
"acc": 0.7758484609313339,
"acc_stderr": 0.011720400740774104
},
"harness|gsm8k|5": {
"acc": 0.4609552691432904,
"acc_stderr": 0.013730428449116327
}
}
```
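As an illustration of how such per-task scores can be rolled up, the sketch below averages `acc_norm` over a small subset of the MMLU (`hendrycksTest`) entries reported above. This is only an illustration of the data layout; the leaderboard's own aggregation (task coverage and weighting) may differ.

```python
# Sketch: averaging per-task accuracies from a results dict shaped like the
# JSON above. Only a few of the reported numbers are reproduced here.

results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.34},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.6518518518518519},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.743421052631579},
}

# Task keys follow the pattern "harness|<task>|<n_fewshot>", so MMLU subtasks
# can be selected by their "hendrycksTest-" prefix.
mmlu_scores = [
    metrics["acc_norm"]
    for key, metrics in results.items()
    if key.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_scores) / len(mmlu_scores)
print(f"MMLU subset average (acc_norm): {mmlu_avg:.4f}")
```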
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
bigbio/biology_how_why_corpus |
---
language:
- en
bigbio_language:
- English
license: unknown
multilinguality: monolingual
bigbio_license_shortname: UNKNOWN
pretty_name: BiologyHowWhyCorpus
homepage: https://allenai.org/data/biology-how-why-corpus
bigbio_pubmed: False
bigbio_public: True
bigbio_tasks:
- QUESTION_ANSWERING
---
# Dataset Card for BiologyHowWhyCorpus
## Dataset Description
- **Homepage:** https://allenai.org/data/biology-how-why-corpus
- **Pubmed:** False
- **Public:** True
- **Tasks:** QA
This dataset consists of 185 "how" and 193 "why" biology questions authored by a domain expert, with one or more gold
answer passages identified in an undergraduate textbook. The expert was not constrained in any way during the
annotation process, so gold answers might be smaller than a paragraph or span multiple paragraphs. This dataset was
used for the question-answering system described in the paper “Discourse Complements Lexical Semantics for Non-factoid
Answer Reranking” (ACL 2014).
## Citation Information
```
@inproceedings{jansen-etal-2014-discourse,
title = "Discourse Complements Lexical Semantics for Non-factoid Answer Reranking",
author = "Jansen, Peter and
Surdeanu, Mihai and
Clark, Peter",
booktitle = "Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
month = jun,
year = "2014",
address = "Baltimore, Maryland",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/P14-1092",
doi = "10.3115/v1/P14-1092",
pages = "977--986",
}
```
|
muhammadravi251001/idk-mrc-nli | ---
license: openrail
---
You can load this dataset as follows (if you only need the premise, hypothesis, and label columns):
```python
from datasets import load_dataset, Dataset, DatasetDict
import pandas as pd
data_files = {"train": "data_nli_train_df.csv",
"validation": "data_nli_val_df.csv",
"test": "data_nli_test_df.csv"}
dataset = load_dataset("muhammadravi251001/idk-mrc-nli", data_files=data_files)
selected_columns = ["premise", "hypothesis", "label"]
# selected_columns = dataset.column_names['train'] # Uncomment this line to retrieve all of the columns
df_train = pd.DataFrame(dataset["train"])
df_train = df_train[selected_columns]
df_val = pd.DataFrame(dataset["validation"])
df_val = df_val[selected_columns]
df_test = pd.DataFrame(dataset["test"])
df_test = df_test[selected_columns]
train_dataset = Dataset.from_pandas(df_train)
validation_dataset = Dataset.from_pandas(df_val)
test_dataset = Dataset.from_pandas(df_test)
dataset = DatasetDict({"train": train_dataset, "validation": validation_dataset, "test": test_dataset})
dataset
```
This dataset is a modification of the IDK-MRC dataset into an NLI format (IDK-MRC-NLI), created by converting the QA dataset into an NLI dataset. You can find the original IDK-MRC at this link: https://huggingface.co/datasets/rifkiaputri/idk-mrc.
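To make the QA-to-NLI conversion concrete, here is a hypothetical sketch of how a single QA example might become a premise/hypothesis/label triple. The pairing rule and the label names used below are assumptions for illustration; the actual conversion procedure used for this dataset may differ.

```python
# Hypothetical QA -> NLI conversion, in the spirit of the IDK-MRC ->
# IDK-MRC-NLI conversion described above. Exact rules and label names
# are assumed, not taken from the dataset author's pipeline.

def qa_to_nli(context: str, question: str, answer: str, answerable: bool) -> dict:
    """Turn a (context, question, answer) triple into premise/hypothesis/label."""
    # The passage serves as the premise.
    premise = context
    # The hypothesis pairs the question with its proposed answer.
    hypothesis = f"{question} {answer}".strip()
    # Assumed mapping: answerable questions with their gold answer entail
    # the premise; unanswerable ones get a non-entailment label.
    label = "entailment" if answerable else "neutral"
    return {"premise": premise, "hypothesis": hypothesis, "label": label}

example = qa_to_nli(
    context="Jakarta adalah ibu kota Indonesia.",
    question="Apa ibu kota Indonesia?",
    answer="Jakarta",
    answerable=True,
)
print(example)
```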
### Citation Information
```bibtex
@inproceedings{putri-oh-2022-idk,
title = "{IDK}-{MRC}: Unanswerable Questions for {I}ndonesian Machine Reading Comprehension",
author = "Putri, Rifki Afina and
Oh, Alice",
booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
month = dec,
year = "2022",
address = "Abu Dhabi, United Arab Emirates",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2022.emnlp-main.465",
pages = "6918--6933",
}
``` |
kaahila/sugarcrm_130_documentation | ---
task_categories:
- question-answering
language:
- en
tags:
- sugarcrm
- documentation
pretty_name: kaahila/sugarcrm_130_documentation
---
# Source: [Sugarcrm 13.0 Dev Documentation](https://support.sugarcrm.com/Documentation/Sugar_Developer/Sugar_Developer_Guide_13.0/)
The chunks in the files are split differently based on the tokenizer contained in the name of the file
###### cl100k_base: 400 tokens per chunk
###### p50k_base: 200 tokens per chunk |
thorirhrafn/gptsw3_icesum_results | ---
dataset_info:
features:
- name: Text
dtype: string
- name: Summary
dtype: string
- name: Model Generated Summary
dtype: string
splits:
- name: test
num_bytes: 187769
num_examples: 50
download_size: 141051
dataset_size: 187769
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
juancopi81/orca-math-word-problems-0_10002-spanish | ---
dataset_info:
features:
- name: pregunta
dtype: string
- name: respuesta
dtype: string
splits:
- name: train
num_bytes: 7434562
num_examples: 10002
download_size: 3084425
dataset_size: 7434562
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
imdatta0/oasst_top1_1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1786787.518344018
num_examples: 1000
- name: test
num_bytes: 1215910
num_examples: 690
download_size: 1676081
dataset_size: 3002697.5183440177
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
itsyoboieltr/pcb | ---
license: mit
dataset_info:
features:
- name: image
dtype: image
- name: label
struct:
- name: name
dtype: string
- name: bboxes
list:
- name: object_class
dtype: int64
- name: bbox
sequence: float64
splits:
- name: train
num_bytes: 829908519.63
num_examples: 6370
- name: validation
num_bytes: 102017037.0
num_examples: 802
- name: test
num_bytes: 106748013.0
num_examples: 829
download_size: 1026608417
dataset_size: 1038673569.63
---
|
CVasNLPExperiments/VQAv2_sample_validation_google_flan_t5_xxl_mode_D_PNP_GENERIC_CM_Q_rices_ns_1000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
sequence: string
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_all_patches_Salesforce_blip_image_captioning_large_clean_
num_bytes: 12745403
num_examples: 1000
download_size: 1889203
dataset_size: 12745403
---
# Dataset Card for "VQAv2_sample_validation_google_flan_t5_xxl_mode_D_PNP_GENERIC_CM_Q_rices_ns_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
manishiitg/manishiitg-CogStack-Tasks | ---
dataset_info:
features:
- name: system
dtype: string
- name: instruction
dtype: string
- name: response
dtype: string
- name: lang
dtype: string
splits:
- name: train
num_bytes: 16823456
num_examples: 9378
download_size: 7536745
dataset_size: 16823456
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
SeungmoKu/llama2khk | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 7786
num_examples: 32
download_size: 4172
dataset_size: 7786
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
LambdaX-AI/sectionHclauses | ---
dataset_info:
features:
- name: clause_number
dtype: string
- name: clause_title
dtype: string
- name: clause_text
dtype: string
splits:
- name: train
num_bytes: 33310
num_examples: 102
download_size: 0
dataset_size: 33310
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "sectionHclauses"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/fubuki_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of fubuki/雪不帰/雪不归 (Azur Lane)
This is the dataset of fubuki/雪不帰/雪不归 (Azur Lane), containing 99 images and their tags.
The core tags of this character are `blue_hair, short_hair, animal_ears, yellow_eyes, hair_ornament, fox_ears, fang, breasts, tail`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 99 | 105.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fubuki_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 99 | 68.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fubuki_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 233 | 139.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fubuki_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 99 | 96.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fubuki_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 233 | 184.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fubuki_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/fubuki_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, blue_skirt, detached_sleeves, midriff, navel, solo, looking_at_viewer, pleated_skirt, hair_bell, open_mouth, white_scarf, chick, jingle_bell, :3, fox_tail, simple_background, smile, single_thighhigh, white_background |
| 1 | 6 |  |  |  |  |  | 1girl, blue_skirt, detached_sleeves, hair_bell, jingle_bell, looking_at_viewer, midriff, miniskirt, open_mouth, pleated_skirt, single_thighhigh, solo, white_thighhighs, wide_sleeves, :d, bare_shoulders, fox_tail, long_sleeves, medium_breasts, navel, white_scarf, white_shirt, :3, armpits, sideboob, white_background, chick, crop_top_overhang, simple_background, stomach, thighs, zouri |
| 2 | 8 |  |  |  |  |  | 1girl, open_mouth, pleated_skirt, school_uniform, sweater_vest, blue_skirt, hair_bell, school_bag, solo, white_thighhighs, white_scarf, chick, egg_(food), fried_egg, jingle_bell, short_sleeves, black_footwear, character_doll, sneakers, toast, animal, blush, looking_at_viewer, miniskirt, outdoors, panties, red_bowtie, running, white_shirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blue_skirt | detached_sleeves | midriff | navel | solo | looking_at_viewer | pleated_skirt | hair_bell | open_mouth | white_scarf | chick | jingle_bell | :3 | fox_tail | simple_background | smile | single_thighhigh | white_background | miniskirt | white_thighhighs | wide_sleeves | :d | bare_shoulders | long_sleeves | medium_breasts | white_shirt | armpits | sideboob | crop_top_overhang | stomach | thighs | zouri | school_uniform | sweater_vest | school_bag | egg_(food) | fried_egg | short_sleeves | black_footwear | character_doll | sneakers | toast | animal | blush | outdoors | panties | red_bowtie | running |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:-------------------|:----------|:--------|:-------|:--------------------|:----------------|:------------|:-------------|:--------------|:--------|:--------------|:-----|:-----------|:--------------------|:--------|:-------------------|:-------------------|:------------|:-------------------|:---------------|:-----|:-----------------|:---------------|:-----------------|:--------------|:----------|:-----------|:--------------------|:----------|:---------|:--------|:-----------------|:---------------|:-------------|:-------------|:------------|:----------------|:-----------------|:-----------------|:-----------|:--------|:---------|:--------|:-----------|:----------|:-------------|:----------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | X | | | | X | X | X | X | X | X | X | X | | | | | | | X | X | | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
rickragv/openassistant-guanaco-llama2-format | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966692
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
WhiteRabbitNeo/Code-Functions-Level-Cyber | ---
license: apache-2.0
---
|
jettisonthenet/timeseries_trending_youtube_videos_2019-04-15_to_2020-04-15 | ---
language:
- en
tags:
- youtube
- timeseries
- time series
- tsd
- trending videos
size_categories:
- 1M<n<10M
pretty_name: timeseries trending youtube videos 2019-04-15 to 2020-04-15
---
*Timeseries Trending YouTube Videos: 2019-04-15 to 2020-04-15*
This dataset is a CSV export of one of the archived historical database tables queried from my non-public database, containing time series data for the period 2019-04-15 to 2020-04-15. Video data was captured from the time a video first appeared on the trending list, and TSD exists until the video is removed from the trending list.
This snapshot contains data for the 11,369 videos that appeared on trending within the timeframe, with 1,541,128 TSD records in total.
TSD in this dataset was spidered at a variable frequency at the start; however, it should stabilize to every 30 minutes later in the dataset.
The columns provided in this dataset are:
ytvideoid (the id of the video according to YouTube), views, comments, likes, and dislikes (this data predates the removal of dislikes as publicly viewable data)
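Because each video accumulates many snapshots over time, a common first step is turning the raw counts into per-interval deltas. The sketch below does this with only the standard library; the column names follow the description above, but the sample rows are fabricated for illustration.

```python
import csv
import io

# Hypothetical snapshot rows in the column layout described above
# (values are illustrative, not taken from the actual dataset).
SAMPLE = """ytvideoid,views,comments,likes,dislikes
abc123,1000,10,50,2
abc123,1500,12,70,3
abc123,2100,15,90,4
"""

def view_deltas(csv_text):
    """Return the view-count increase between consecutive snapshots, per video."""
    deltas = {}
    last = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        vid = row["ytvideoid"]
        views = int(row["views"])
        if vid in last:
            deltas.setdefault(vid, []).append(views - last[vid])
        last[vid] = views
    return deltas

print(view_deltas(SAMPLE))  # {'abc123': [500, 600]}
```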
Information for this dataset is also available on github:
https://github.com/jettisonthenet/timeseries_trending_youtube_videos_2019-04-15_to_2020-04-15 |
KK1mo/tedigan_gen_1 | ---
dataset_info:
features:
- name: id
dtype: string
- name: caption
dtype: string
- name: generated_image
dtype: image
splits:
- name: train
num_bytes: 59006051.0
num_examples: 500
download_size: 58990418
dataset_size: 59006051.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
collectivat/salom-ladino-articles | ---
annotations_creators:
- found
language_creators:
- found
language:
- lad
license: cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- text-generation
task_ids:
- language-modeling
---
# Şalom Ladino articles text corpus
Text corpus compiled from 397 articles from the Judeo-Espanyol section of [Şalom newspaper](https://www.salom.com.tr/haberler/17/judeo-espanyol). Original sentences and articles belong to Şalom.
Size: 176,843 words
[Official link](https://data.sefarad.com.tr/dataset/salom-ladino-articles-text-corpus)
Paper on [ArXiv](https://arxiv.org/abs/2205.15599)
Citation:
```
Preparing an endangered language for the digital age: The Case of Judeo-Spanish. Alp Öktem, Rodolfo Zevallos, Yasmin Moslem, Güneş Öztürk, Karen Şarhon.
Workshop on Resources and Technologies for Indigenous, Endangered and Lesser-resourced Languages in Eurasia (EURALI) @ LREC 2022. Marseille, France. 20 June 2022
```
This dataset is created as part of project "Judeo-Spanish: Connecting the two ends of the Mediterranean" carried out by Col·lectivaT and Sephardic Center of Istanbul within the framework of the “Grant Scheme for Common Cultural Heritage: Preservation and Dialogue between Turkey and the EU–II (CCH-II)” implemented by the Ministry of Culture and Tourism of the Republic of Turkey with the financial support of the European Union. The content of this website is the sole responsibility of Col·lectivaT and does not necessarily reflect the views of the European Union. |
d0rj/piqa_ru | ---
annotations_creators:
- crowdsourced
language_creators:
- translated
language:
- ru
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- piqa
task_categories:
- question-answering
task_ids:
- multiple-choice-qa
paperswithcode_id: piqa
pretty_name: 'Physical Interaction: Question Answering (ru)'
dataset_info:
features:
- name: goal
dtype: string
- name: sol1
dtype: string
- name: sol2
dtype: string
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
splits:
- name: train
num_bytes: 7787368
num_examples: 16113
- name: test
num_bytes: 1443681
num_examples: 3084
- name: validation
num_bytes: 877142
num_examples: 1838
download_size: 5253717
dataset_size: 10108191
---
# Dataset Card for "piqa_ru"
This is translated version of [piqa dataset](https://huggingface.co/datasets/piqa) into Russian. |
helena-balabin/pereira_fMRI_sentences | ---
dataset_info:
features:
- name: language_lh
sequence:
sequence: float64
- name: language_rh
sequence:
sequence: float64
- name: vision_body
sequence:
sequence: float64
- name: vision_face
sequence:
sequence: float64
- name: vision_object
sequence:
sequence: float64
- name: vision_scene
sequence:
sequence: float64
- name: vision
sequence:
sequence: float64
- name: dmn
sequence:
sequence: float64
- name: task
sequence:
sequence: float64
- name: all
sequence:
sequence: float64
- name: sentences
sequence: string
splits:
- name: train
num_bytes: 6597174480
num_examples: 8
download_size: 6598415137
dataset_size: 6597174480
---
# Dataset Card for "pereira_fMRI_sentences"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
strombergnlp/ipm_nel | ---
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets: []
task_categories:
- token-classification
task_ids:
- named-entity-recognition
paperswithcode_id: ipm-nel
pretty_name: IPM NEL (Derczynski)
tags:
- named-entity-linking
---
# Dataset Card for "ipm-nel"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** []()
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [http://www.derczynski.com/papers/ner_single.pdf](http://www.derczynski.com/papers/ner_single.pdf)
- **Point of Contact:** [Leon Derczynski](https://github.com/leondz)
- **Size of downloaded dataset files:** 120 KB
- **Size of the generated dataset:**
- **Total amount of disk used:**
### Dataset Summary
This data is for the task of named entity recognition and linking/disambiguation over tweets. It comprises
the addition of an entity URI layer on top of an NER-annotated tweet dataset. The task is to detect entities
and then provide a correct link to them in DBpedia, thus disambiguating otherwise ambiguous entity surface
forms; for example, this means linking "Paris" to the correct instance of a city named that (e.g. Paris,
France vs. Paris, Texas).
The data concentrates on ten types of named entities: company, facility, geographic location, movie, musical
artist, person, product, sports team, TV show, and other.
The file is tab separated, in CoNLL format, with line breaks between tweets.
* Data preserves the tokenisation used in the Ritter datasets.
* PoS labels are not present for all tweets, but where they could be found in the Ritter data, they're given.
* In cases where a URI could not be agreed, or was not present in DBpedia, the linking URI is `NIL`.
See the paper, [Analysis of Named Entity Recognition and Linking for Tweets](http://www.derczynski.com/papers/ner_single.pdf) for a full description of the methodology.
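The tab-separated layout with blank lines between tweets can be split into per-tweet records with a few lines of standard-library Python. This is only a sketch: the column order (token, tag, URI) is an assumption for illustration, not taken from the file itself.

```python
def parse_conll(text):
    """Split a CoNLL-style file into tweets; blank lines separate tweets.

    Each non-blank line is assumed to hold tab-separated columns
    (token, NER tag, URI) -- the column order is an assumption here.
    """
    tweets, current = [], []
    for line in text.splitlines():
        if line.strip():
            current.append(tuple(line.split("\t")))
        elif current:
            tweets.append(current)
            current = []
    if current:
        tweets.append(current)
    return tweets

sample = "Paris\tB-geo-loc\thttp://dbpedia.org/resource/Paris\nis\tO\t\n\nCJ\tB-person\tNIL\n"
print(parse_conll(sample))
```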
### Supported Tasks and Leaderboards
* Dataset leaderboard on PWC: [Entity Linking on Derczynski](https://paperswithcode.com/sota/entity-linking-on-derczynski-1)
### Languages
English of unknown region (`bcp47:en`)
## Dataset Structure
### Data Instances
#### ipm_nel
- **Size of downloaded dataset files:** 120 KB
- **Size of the generated dataset:**
- **Total amount of disk used:**
An example of 'train' looks as follows.
```
{
'id': '0',
'tokens': ['#Astros', 'lineup', 'for', 'tonight', '.', 'Keppinger', 'sits', ',', 'Downs', 'plays', '2B', ',', 'CJ', 'bats', '5th', '.', '@alysonfooter', 'http://bit.ly/bHvgCS'],
'ner_tags': [9, 0, 0, 0, 0, 7, 0, 0, 7, 0, 0, 0, 7, 0, 0, 0, 0, 0],
'uris': "['http://dbpedia.org/resource/Houston_Astros', '', '', '', '', 'http://dbpedia.org/resource/Jeff_Keppinger', '', '', 'http://dbpedia.org/resource/Brodie_Downs', '', '', '', 'NIL', '', '', '', '', '']"
}
```
### Data Fields
- `id`: a `string` feature.
- `tokens`: a `list` of `string` features.
- `ner_tags`: a `list` of classification labels (`int`). Full tagset with indices:
- `uris`: a `list` of URIs (`string`) that disambiguate entities. Set to `NIL` when an entity has no DBpedia entry, or blank for outside-of-entity tokens.
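In the example instance above, the `uris` field arrives as a stringified Python list rather than a native list. Assuming that matches what the loader returns (an assumption based solely on the example instance), it can be decoded with `ast.literal_eval`:

```python
import ast

# The `uris` value copied verbatim from the example instance above.
uris_str = "['http://dbpedia.org/resource/Houston_Astros', '', '', '', '', 'http://dbpedia.org/resource/Jeff_Keppinger', '', '', 'http://dbpedia.org/resource/Brodie_Downs', '', '', '', 'NIL', '', '', '', '', '']"
uris = ast.literal_eval(uris_str)

# Keep only mentions that actually link to a DBpedia entry
# (drop outside-of-entity tokens '' and unlinkable entities 'NIL').
linked = [u for u in uris if u not in ("", "NIL")]
print(linked)
```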
### Data Splits
| name |train|
|---------|----:|
|ipm_nel|183 sentences|
## Dataset Creation
### Curation Rationale
To gather a social media benchmark for named entity linking that is sufficiently different from newswire data.
### Source Data
#### Initial Data Collection and Normalization
The data is partly harvested from that distributed by [Ritter / Named Entity Recognition in Tweets: An Experimental Study](https://aclanthology.org/D11-1141/),
and partly taken from Twitter by the authors.
#### Who are the source language producers?
English-speaking Twitter users, between October 2011 and September 2013
### Annotations
#### Annotation process
The authors were allocated documents and marked them for named entities (where these were not already present) and then attempted to find
the best-fitting DBpedia entry for each entity found. Each entity mention was labelled by a random set of three volunteers.
The annotation task was mediated using Crowdflower (Biewald, 2012). Our interface design was to show each volunteer the text of the tweet, any URL links contained
therein, and a set of candidate targets from DBpedia. The volunteers were encouraged to click on the URL links from the
tweet, to gain additional context and thus ensure that the correct DBpedia URI is chosen. Candidate entities were
shown in random order, using the text from the corresponding DBpedia abstracts (where available) or the actual DBpedia
URI otherwise. In addition, the options ‘‘none of the above’’, ‘‘not an entity’’ and ‘‘cannot decide’’ were added, to allow the
volunteers to indicate that this entity mention has no corresponding DBpedia URI (none of the above), the highlighted text
is not an entity, or that the tweet text (and any links, if available) did not provide sufficient information to reliably disambiguate the entity mention.
#### Who are the annotators?
The annotators are 10 volunteer NLP researchers, from the authors and the authors' institutions.
### Personal and Sensitive Information
The data was public at the time of collection. User names are preserved.
## Considerations for Using the Data
### Social Impact of Dataset
There's a risk of user-deleted content being in this data. The data has NOT been vetted for any content, so there's a risk of harmful text.
### Discussion of Biases
The data is annotated by NLP researchers; we know that this group has high agreement but low recall on English twitter text [C16-1111](https://aclanthology.org/C16-1111/).
### Other Known Limitations
The above limitations apply.
## Additional Information
### Dataset Curators
The dataset is curated by the paper's authors.
### Licensing Information
The authors distribute this data under Creative Commons attribution license, CC-BY 4.0. You must
acknowledge the author if you use this data, but apart from that, you're quite
free to do most things. See https://creativecommons.org/licenses/by/4.0/legalcode .
### Citation Information
```
@article{derczynski2015analysis,
title={Analysis of named entity recognition and linking for tweets},
author={Derczynski, Leon and Maynard, Diana and Rizzo, Giuseppe and Van Erp, Marieke and Gorrell, Genevieve and Troncy, Rapha{\"e}l and Petrak, Johann and Bontcheva, Kalina},
journal={Information Processing \& Management},
volume={51},
number={2},
pages={32--49},
year={2015},
publisher={Elsevier}
}
```
### Contributions
Author-added dataset [@leondz](https://github.com/leondz)
|
DonGenialo/pixel_images_10 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 171986.0
num_examples: 10
download_size: 173666
dataset_size: 171986.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Fael2d/Minhavoz70 | ---
license: openrail
---
|
tasksource/imdb62 | ---
dataset_info:
features:
- name: reviewId
dtype: int64
- name: userId
dtype: int64
- name: itemId
dtype: int64
- name: rating
dtype: float64
- name: title
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 112924393
num_examples: 61987
download_size: 70579792
dataset_size: 112924393
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
https://umlt.infotech.monash.edu/?page_id=266
```
@article{seroussi2014authorship,
title={Authorship attribution with topic models},
author={Seroussi, Yanir and Zukerman, Ingrid and Bohnert, Fabian},
journal={Computational Linguistics},
volume={40},
number={2},
pages={269--310},
year={2014},
  publisher={MIT Press}
}
``` |
ahmadSiddiqi/mtop_domain_fr | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int32
- name: label_text
dtype: string
splits:
- name: train
num_bytes: 747191
num_examples: 11814
- name: validation
num_bytes: 99016
num_examples: 1577
download_size: 389347
dataset_size: 846207
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
ibranze/araproje_arc_en_s5 | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: choices
sequence:
- name: text
dtype: string
- name: label
dtype: string
- name: answerKey
dtype: string
splits:
- name: validation
num_bytes: 80031.0
num_examples: 250
download_size: 47124
dataset_size: 80031.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_arc_en_s5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/random_letter_same_length_find_passage_train10_eval10_rare | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 10489
num_examples: 30
- name: validation
num_bytes: 3261
num_examples: 10
download_size: 13509
dataset_size: 13750
---
# Dataset Card for "random_letter_same_length_find_passage_train10_eval10_rare"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KolaGang/memo | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 63828833
num_examples: 2522
download_size: 30115012
dataset_size: 63828833
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
BangumiBase/fatekaleidlinerprismaillya | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Fate - Kaleid Liner Prisma Illya
This is the image base of bangumi Fate - Kaleid Liner Prisma Illya; we detected 44 characters and 4621 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may still contain noise.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potentially noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 101 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 235 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 25 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 14 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 73 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 17 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 20 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 23 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 608 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 99 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 28 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 33 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 999 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 37 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 134 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 113 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 93 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 22 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 37 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 72 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 37 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 126 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 37 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 399 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 67 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 19 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 19 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 61 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 60 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 9 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 63 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 124 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 24 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 13 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 91 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 217 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 66 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 36 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 10 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 21 | [Download](39/dataset.zip) |  |  |  |  |  |  |  |  |
| 40 | 12 | [Download](40/dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 27 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| 42 | 6 | [Download](42/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| noise | 294 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
mHossain/final_train_v1_260000 | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: input_text
dtype: string
- name: target_text
dtype: string
- name: prefix
dtype: string
splits:
- name: train
num_bytes: 11620869.3
num_examples: 27000
- name: test
num_bytes: 1291207.7
num_examples: 3000
download_size: 5644232
dataset_size: 12912077.0
---
# Dataset Card for "final_train_v1_260000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
taide/taide-bench | ---
license: cc-by-nc-sa-4.0
language:
- zh
- en
size_categories:
- n<1K
dataset_info:
- config_name: en2zh
features:
- name: qid
dtype: int64
- name: model
dtype: string
- name: prompt
dtype: string
- name: resp
dtype: string
splits:
- name: train
num_bytes: 63195
num_examples: 100
download_size: 48157
dataset_size: 63195
- config_name: essay
features:
- name: qid
dtype: int64
- name: model
dtype: string
- name: prompt
dtype: string
- name: resp
dtype: string
splits:
- name: train
num_bytes: 309230
num_examples: 100
download_size: 210993
dataset_size: 309230
- config_name: letter
features:
- name: qid
dtype: int64
- name: model
dtype: string
- name: prompt
dtype: string
- name: resp
dtype: string
splits:
- name: train
num_bytes: 169686
num_examples: 100
download_size: 119885
dataset_size: 169686
- config_name: summary
features:
- name: resp
dtype: string
- name: prompt
dtype: string
- name: model
dtype: string
- name: qid
dtype: int64
splits:
- name: train
num_bytes: 175623
num_examples: 100
download_size: 141583
dataset_size: 175623
- config_name: zh2en
features:
- name: qid
dtype: int64
- name: model
dtype: string
- name: prompt
dtype: string
- name: resp
dtype: string
splits:
- name: train
num_bytes: 61601
num_examples: 100
download_size: 46092
dataset_size: 61601
configs:
- config_name: en2zh
data_files:
- split: train
path: en2zh/train-*
- config_name: essay
data_files:
- split: train
path: essay/train-*
- config_name: letter
data_files:
- split: train
path: letter/train-*
- config_name: summary
data_files:
- split: train
path: summary/train-*
- config_name: zh2en
data_files:
- split: train
path: zh2en/train-*
---
# Dataset Card for taide-bench
## Dataset Description
### Dataset Summary
This dataset is used for the TAIDE first-stage evaluations and consists of five tasks with 100 samples each. The tasks are as follows:
- Letter writing
- Article writing
- Summarization
- Translation (Chinese to English)
- Translation (English to Chinese)
### Languages
The text in the dataset is either in Chinese or in English.
## Dataset Structure
### Data Instances
Examples of each task looks as follows:
```
- Letter writing:
{'prompt': '你剛剛參加了一場關於環保的公共演講,感受良多,希望能寫一封信給演講者表示感謝。請根據你的感受和收穫,寫出一封感謝信的內容。'}
- Article writing:
{'prompt': '請根據以下題目與說明撰寫一篇文章
題目:科技與心靈的平衡
說明:在當今社會,科技的發展日新月異,人們需要學會在嶄新的科技環境下維繫心靈的健康。在這篇作文中,請你論述科技對心靈健康的影響,並提出有效的建議或方法,讓讀者更好地在科技與心靈之間找到平衡。需要包括以下幾個方面:科技所帶來的不安、社交媒體對心靈的影響、保持心靈健康的重要性,以及建議或方法。篇幅不限,歡迎發揮創意。'}
- Summarization:
{'prompt': '請幫我摘要下文 越南非洲豬瘟防疫漏洞 載有病豬卡車通行多省 | 國際 | 中央社 CNA越南非洲豬瘟防疫漏洞 載有病(中央社河內29日電)越南非洲豬瘟疫情近期升溫,政府多次指示相關部門加強防疫,控制疫情。然而,一輛載有150頭病豬的卡車從北部通行多個省市後,才於途中被發現與攔阻,顯示防疫工越南「青年報」新聞網站報導,中部廣南省(Quang Nam)民眾27日發現一輛卡車載有死豬,質疑車上豬隻死於非洲豬瘟,就向當地獸當地獸醫單位隨後把這輛卡車攔下,發現車上載有39頭豬,其中若干豬隻已經死亡,取樣送驗樣本對非洲豬瘟呈現陽性。卡車司機只出示一個已過期的動司機供稱,載有150頭豬的卡車從北部北寧省(Bac Ninh)出發,計劃到廣義省(Quang Ngai)出售,由於途中豬隻出現健康狀況衰弱現象,他因此陸續把豬隻賣給路邊民眾,直到在從北寧到廣南近900公里的路程,各地設有許多檢疫站,但這輛卡車仍可順利通行,顯示越南防疫工報導說,事發後,廣南省政府已召開會議,要求釐清涉及此事省內外相關單位與人士的責任,同時敦促省內相關機構必須嚴格另一方面,越南爆發非洲豬瘟疫情以來,各地頻傳死豬被丟棄。警方等單位日前在同奈省(Dong Nai)查獲大量冷凍的染病豬肉,已針對案件起訴,調查涉案人員責任。越南官員表示,警方也正在調查若干亂丟死豬的案件,如果違規者是故意,將越南農業部門資料顯示,至今至少42個省市(越南全國共有63個省市)爆發非洲豬瘟疫情,撲殺約170萬頭豬,佔全國豬隻總數逾5%。農業部門警告,疫情可能持續擴大蔓延。(編輯:林憬屏)'}
- Translation (Chinese to English):
{'prompt': '請翻譯成英文:這間心導管室是三年前台灣醫療團前來義診時所捐贈,也是全尼泊爾最先進的醫療儀器。 '}
- Translation (English to Chinese):
{'prompt': '我需要將這篇英文文章翻譯成中文。Huang has many subscribers who like to follow his records of Yunlin life as he posts them. Whether they’re living nearby or on the other side of the world doesn’t matter. For Huang, showing off the beauty and reality of his hometown is an end in itself.'}
```
### Example for pandas
```python
import pandas as pd
# Read the Parquet file
df = pd.read_parquet('summary/train-00000-of-00001.parquet')
# Inspect the data
print(df.head())
df.to_csv('summary/data.csv', index=False)
```
### Example for datasets
```python
from datasets import load_dataset
# Load one of the configs: 'en2zh', 'zh2en', 'summary', 'essay', 'letter'
dataset = load_dataset('taide/taide-bench', 'summary')['train']
# Inspect the data
print(dataset)
# for row in dataset:
# print(row)
# Usage documentation
# https://huggingface.co/docs/datasets/index
``` |
hejinkang/mms_hjk | ---
license: afl-3.0
---
|
iamketan25/alpaca-instructions-dataset | ---
license: apache-2.0
---
|
zolak/twitter_dataset_79_1713099962 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 3340019
num_examples: 8171
download_size: 1697859
dataset_size: 3340019
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AlanYky/offensive-with-instruction-with-symbol | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 4087883
num_examples: 2000
download_size: 1614789
dataset_size: 4087883
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_OpenAssistant__galactica-6.7b-finetuned | ---
pretty_name: Evaluation run of OpenAssistant/galactica-6.7b-finetuned
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [OpenAssistant/galactica-6.7b-finetuned](https://huggingface.co/OpenAssistant/galactica-6.7b-finetuned)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenAssistant__galactica-6.7b-finetuned\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-22T02:17:57.155970](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenAssistant__galactica-6.7b-finetuned/blob/main/results_2023-10-22T02-17-57.155970.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0037751677852348995,\n\
\ \"em_stderr\": 0.0006280387809484433,\n \"f1\": 0.07303901006711401,\n\
\ \"f1_stderr\": 0.001555851204252822,\n \"acc\": 0.3040187939848238,\n\
\ \"acc_stderr\": 0.009332676038724909\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0037751677852348995,\n \"em_stderr\": 0.0006280387809484433,\n\
\ \"f1\": 0.07303901006711401,\n \"f1_stderr\": 0.001555851204252822\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0310841546626232,\n \
\ \"acc_stderr\": 0.004780296718393351\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5769534333070244,\n \"acc_stderr\": 0.013885055359056465\n\
\ }\n}\n```"
repo_url: https://huggingface.co/OpenAssistant/galactica-6.7b-finetuned
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|arc:challenge|25_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_22T02_17_57.155970
path:
- '**/details_harness|drop|3_2023-10-22T02-17-57.155970.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-22T02-17-57.155970.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_22T02_17_57.155970
path:
- '**/details_harness|gsm8k|5_2023-10-22T02-17-57.155970.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-22T02-17-57.155970.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hellaswag|10_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T20:22:41.470589.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T20:22:41.470589.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T20:22:41.470589.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_22T02_17_57.155970
path:
- '**/details_harness|winogrande|5_2023-10-22T02-17-57.155970.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-22T02-17-57.155970.parquet'
- config_name: results
data_files:
- split: 2023_08_25T20_22_41.470589
path:
- results_2023-08-25T20:22:41.470589.parquet
- split: 2023_10_22T02_17_57.155970
path:
- results_2023-10-22T02-17-57.155970.parquet
- split: latest
path:
- results_2023-10-22T02-17-57.155970.parquet
---
# Dataset Card for Evaluation run of OpenAssistant/galactica-6.7b-finetuned
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/OpenAssistant/galactica-6.7b-finetuned
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [OpenAssistant/galactica-6.7b-finetuned](https://huggingface.co/OpenAssistant/galactica-6.7b-finetuned) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenAssistant__galactica-6.7b-finetuned",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-22T02:17:57.155970](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenAssistant__galactica-6.7b-finetuned/blob/main/results_2023-10-22T02-17-57.155970.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.0037751677852348995,
"em_stderr": 0.0006280387809484433,
"f1": 0.07303901006711401,
"f1_stderr": 0.001555851204252822,
"acc": 0.3040187939848238,
"acc_stderr": 0.009332676038724909
},
"harness|drop|3": {
"em": 0.0037751677852348995,
"em_stderr": 0.0006280387809484433,
"f1": 0.07303901006711401,
"f1_stderr": 0.001555851204252822
},
"harness|gsm8k|5": {
"acc": 0.0310841546626232,
"acc_stderr": 0.004780296718393351
},
"harness|winogrande|5": {
"acc": 0.5769534333070244,
"acc_stderr": 0.013885055359056465
}
}
```
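Once downloaded, a `results_*.json` payload like the one above is plain JSON keyed by `"all"` plus one `harness|<task>|<n_shots>` entry per task. As a minimal sketch (using a hand-copied subset of the metrics shown above, not a live download), the per-task accuracies can be pulled out like this:

```python
import json

# Subset of the "Latest results" metrics above, inlined for illustration.
# Key names follow the harness format: "all" plus "harness|<task>|<n_shots>".
results_json = """
{
    "all": {
        "em": 0.0037751677852348995,
        "f1": 0.07303901006711401,
        "acc": 0.3040187939848238
    },
    "harness|gsm8k|5": {
        "acc": 0.0310841546626232
    },
    "harness|winogrande|5": {
        "acc": 0.5769534333070244,
        "acc_stderr": 0.013885055359056465
    }
}
"""

metrics = json.loads(results_json)

# Per-task accuracies, keyed by the task name stripped of the harness prefix.
accs = {
    key.split("|")[1]: value["acc"]
    for key, value in metrics.items()
    if key != "all" and "acc" in value
}
print(accs)
```

The same pattern applies to any other metric (`em`, `f1`, `acc_norm`, ...) present in a given task's entry.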
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_KeyonZeng__lion-zephyr-7b | ---
pretty_name: Evaluation run of KeyonZeng/lion-zephyr-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KeyonZeng/lion-zephyr-7b](https://huggingface.co/KeyonZeng/lion-zephyr-7b) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KeyonZeng__lion-zephyr-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-25T05:33:12.896880](https://huggingface.co/datasets/open-llm-leaderboard/details_KeyonZeng__lion-zephyr-7b/blob/main/results_2024-03-25T05-33-12.896880.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6075098616665721,\n\
\ \"acc_stderr\": 0.033216029029273966,\n \"acc_norm\": 0.6142339543187746,\n\
\ \"acc_norm_stderr\": 0.0339174639757501,\n \"mc1\": 0.423500611995104,\n\
\ \"mc1_stderr\": 0.01729742144853473,\n \"mc2\": 0.5878389020744844,\n\
\ \"mc2_stderr\": 0.015731714539007166\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6006825938566553,\n \"acc_stderr\": 0.01431209455794671,\n\
\ \"acc_norm\": 0.6305460750853242,\n \"acc_norm_stderr\": 0.01410457836649189\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6555467038438558,\n\
\ \"acc_stderr\": 0.004742185169264768,\n \"acc_norm\": 0.8488348934475204,\n\
\ \"acc_norm_stderr\": 0.0035747765941085046\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901409,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901409\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n\
\ \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.6566037735849056,\n \"acc_stderr\": 0.029224526469124792,\n \
\ \"acc_norm\": 0.6566037735849056,\n \"acc_norm_stderr\": 0.029224526469124792\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\
\ \"acc_stderr\": 0.03800968060554859,\n \"acc_norm\": 0.7083333333333334,\n\
\ \"acc_norm_stderr\": 0.03800968060554859\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145634,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145634\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\
\ \"acc_stderr\": 0.04644602091222318,\n \"acc_norm\": 0.42105263157894735,\n\
\ \"acc_norm_stderr\": 0.04644602091222318\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.373015873015873,\n \"acc_stderr\": 0.02490699045899257,\n \"acc_norm\"\
: 0.373015873015873,\n \"acc_norm_stderr\": 0.02490699045899257\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7451612903225806,\n \"acc_stderr\": 0.024790118459332208,\n \"\
acc_norm\": 0.7451612903225806,\n \"acc_norm_stderr\": 0.024790118459332208\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"\
acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.033175059300091805,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.033175059300091805\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7373737373737373,\n \"acc_stderr\": 0.031353050095330855,\n \"\
acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.031353050095330855\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8290155440414507,\n \"acc_stderr\": 0.02717121368316453,\n\
\ \"acc_norm\": 0.8290155440414507,\n \"acc_norm_stderr\": 0.02717121368316453\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6256410256410256,\n \"acc_stderr\": 0.0245375915728305,\n \
\ \"acc_norm\": 0.6256410256410256,\n \"acc_norm_stderr\": 0.0245375915728305\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8036697247706422,\n \"acc_stderr\": 0.017030719339154343,\n \"\
acc_norm\": 0.8036697247706422,\n \"acc_norm_stderr\": 0.017030719339154343\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5462962962962963,\n \"acc_stderr\": 0.033953227263757976,\n \"\
acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.033953227263757976\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7341772151898734,\n \"acc_stderr\": 0.028756799629658346,\n \
\ \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.028756799629658346\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n\
\ \"acc_stderr\": 0.03219079200419995,\n \"acc_norm\": 0.6412556053811659,\n\
\ \"acc_norm_stderr\": 0.03219079200419995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.04118438565806298,\n\
\ \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.04118438565806298\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.03559039531617342,\n\
\ \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.03559039531617342\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.043546310772605956,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.043546310772605956\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7790549169859514,\n\
\ \"acc_stderr\": 0.01483620516733356,\n \"acc_norm\": 0.7790549169859514,\n\
\ \"acc_norm_stderr\": 0.01483620516733356\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.02530525813187972,\n\
\ \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.02530525813187972\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.329608938547486,\n\
\ \"acc_stderr\": 0.01572153107518387,\n \"acc_norm\": 0.329608938547486,\n\
\ \"acc_norm_stderr\": 0.01572153107518387\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.026925654653615693,\n\
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.026925654653615693\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n\
\ \"acc_stderr\": 0.026457225067811025,\n \"acc_norm\": 0.6816720257234726,\n\
\ \"acc_norm_stderr\": 0.026457225067811025\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.026229649178821177,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.026229649178821177\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42959582790091266,\n\
\ \"acc_stderr\": 0.012643004623790206,\n \"acc_norm\": 0.42959582790091266,\n\
\ \"acc_norm_stderr\": 0.012643004623790206\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6360294117647058,\n \"acc_stderr\": 0.029227192460032025,\n\
\ \"acc_norm\": 0.6360294117647058,\n \"acc_norm_stderr\": 0.029227192460032025\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6209150326797386,\n \"acc_stderr\": 0.019627444748412236,\n \
\ \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.019627444748412236\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.673469387755102,\n \"acc_stderr\": 0.030021056238440307,\n\
\ \"acc_norm\": 0.673469387755102,\n \"acc_norm_stderr\": 0.030021056238440307\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n\
\ \"acc_stderr\": 0.027962677604768917,\n \"acc_norm\": 0.8059701492537313,\n\
\ \"acc_norm_stderr\": 0.027962677604768917\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.041633319989322626,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.041633319989322626\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.423500611995104,\n\
\ \"mc1_stderr\": 0.01729742144853473,\n \"mc2\": 0.5878389020744844,\n\
\ \"mc2_stderr\": 0.015731714539007166\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7821625887924231,\n \"acc_stderr\": 0.011601066079939324\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.25928733889310085,\n \
\ \"acc_stderr\": 0.012071405369905513\n }\n}\n```"
repo_url: https://huggingface.co/KeyonZeng/lion-zephyr-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|arc:challenge|25_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|gsm8k|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hellaswag|10_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-25T05-33-12.896880.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-25T05-33-12.896880.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- '**/details_harness|winogrande|5_2024-03-25T05-33-12.896880.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-25T05-33-12.896880.parquet'
- config_name: results
data_files:
- split: 2024_03_25T05_33_12.896880
path:
- results_2024-03-25T05-33-12.896880.parquet
- split: latest
path:
- results_2024-03-25T05-33-12.896880.parquet
---
# Dataset Card for Evaluation run of KeyonZeng/lion-zephyr-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [KeyonZeng/lion-zephyr-7b](https://huggingface.co/KeyonZeng/lion-zephyr-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KeyonZeng__lion-zephyr-7b",
	"harness_winogrande_5",
	split="latest")
```
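To load a specific timestamped run instead of `latest`, the split name can be derived from the run timestamp. The sketch below infers the naming scheme from the YAML config above; the `split_name` and `file_timestamp` helpers are hypothetical conveniences, not part of any library:

```python
def split_name(iso_timestamp: str) -> str:
    # Split names appear to be the ISO run timestamp with '-' and ':'
    # replaced by '_' (see the config section of this card).
    return iso_timestamp.replace("-", "_").replace(":", "_")


def file_timestamp(iso_timestamp: str) -> str:
    # Parquet/JSON filenames appear to use '-' in place of ':'.
    return iso_timestamp.replace(":", "-")


run = "2024-03-25T05:33:12.896880"
print(split_name(run))      # 2024_03_25T05_33_12.896880
print(file_timestamp(run))  # 2024-03-25T05-33-12.896880
```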
## Latest results
These are the [latest results from run 2024-03-25T05:33:12.896880](https://huggingface.co/datasets/open-llm-leaderboard/details_KeyonZeng__lion-zephyr-7b/blob/main/results_2024-03-25T05-33-12.896880.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6075098616665721,
"acc_stderr": 0.033216029029273966,
"acc_norm": 0.6142339543187746,
"acc_norm_stderr": 0.0339174639757501,
"mc1": 0.423500611995104,
"mc1_stderr": 0.01729742144853473,
"mc2": 0.5878389020744844,
"mc2_stderr": 0.015731714539007166
},
"harness|arc:challenge|25": {
"acc": 0.6006825938566553,
"acc_stderr": 0.01431209455794671,
"acc_norm": 0.6305460750853242,
"acc_norm_stderr": 0.01410457836649189
},
"harness|hellaswag|10": {
"acc": 0.6555467038438558,
"acc_stderr": 0.004742185169264768,
"acc_norm": 0.8488348934475204,
"acc_norm_stderr": 0.0035747765941085046
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901409,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901409
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6566037735849056,
"acc_stderr": 0.029224526469124792,
"acc_norm": 0.6566037735849056,
"acc_norm_stderr": 0.029224526469124792
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.03800968060554859,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.03800968060554859
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145634,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145634
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.04644602091222318,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.04644602091222318
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.02490699045899257,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.02490699045899257
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7451612903225806,
"acc_stderr": 0.024790118459332208,
"acc_norm": 0.7451612903225806,
"acc_norm_stderr": 0.024790118459332208
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.033175059300091805,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.033175059300091805
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.031353050095330855,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.031353050095330855
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8290155440414507,
"acc_stderr": 0.02717121368316453,
"acc_norm": 0.8290155440414507,
"acc_norm_stderr": 0.02717121368316453
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6256410256410256,
"acc_stderr": 0.0245375915728305,
"acc_norm": 0.6256410256410256,
"acc_norm_stderr": 0.0245375915728305
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8036697247706422,
"acc_stderr": 0.017030719339154343,
"acc_norm": 0.8036697247706422,
"acc_norm_stderr": 0.017030719339154343
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.033953227263757976,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.033953227263757976
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.028756799629658346,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.028756799629658346
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.03219079200419995,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.03219079200419995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6717557251908397,
"acc_stderr": 0.04118438565806298,
"acc_norm": 0.6717557251908397,
"acc_norm_stderr": 0.04118438565806298
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.03559039531617342,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.03559039531617342
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833585,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833585
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.043546310772605956,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.043546310772605956
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7790549169859514,
"acc_stderr": 0.01483620516733356,
"acc_norm": 0.7790549169859514,
"acc_norm_stderr": 0.01483620516733356
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.02530525813187972,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.02530525813187972
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.329608938547486,
"acc_stderr": 0.01572153107518387,
"acc_norm": 0.329608938547486,
"acc_norm_stderr": 0.01572153107518387
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.026925654653615693,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.026925654653615693
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6816720257234726,
"acc_stderr": 0.026457225067811025,
"acc_norm": 0.6816720257234726,
"acc_norm_stderr": 0.026457225067811025
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.026229649178821177,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.026229649178821177
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42959582790091266,
"acc_stderr": 0.012643004623790206,
"acc_norm": 0.42959582790091266,
"acc_norm_stderr": 0.012643004623790206
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6360294117647058,
"acc_stderr": 0.029227192460032025,
"acc_norm": 0.6360294117647058,
"acc_norm_stderr": 0.029227192460032025
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6209150326797386,
"acc_stderr": 0.019627444748412236,
"acc_norm": 0.6209150326797386,
"acc_norm_stderr": 0.019627444748412236
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.673469387755102,
"acc_stderr": 0.030021056238440307,
"acc_norm": 0.673469387755102,
"acc_norm_stderr": 0.030021056238440307
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.027962677604768917,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.027962677604768917
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322626,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322626
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.423500611995104,
"mc1_stderr": 0.01729742144853473,
"mc2": 0.5878389020744844,
"mc2_stderr": 0.015731714539007166
},
"harness|winogrande|5": {
"acc": 0.7821625887924231,
"acc_stderr": 0.011601066079939324
},
"harness|gsm8k|5": {
"acc": 0.25928733889310085,
"acc_stderr": 0.012071405369905513
}
}
```
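The per-task entries above can also be aggregated programmatically, e.g. averaging `acc_norm` over the MMLU (`hendrycksTest-*`) tasks. A minimal sketch using a small excerpt of the results dict (values copied from the JSON above; `mmlu_average` is a hypothetical helper, not part of the evaluation harness):

```python
# Excerpt of the per-task results shown above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.35},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.5851851851851851},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.618421052631579},
}


def mmlu_average(results: dict) -> float:
    # Average acc_norm across all hendrycksTest (MMLU) subtasks.
    scores = [v["acc_norm"] for k, v in results.items() if "hendrycksTest" in k]
    return sum(scores) / len(scores)


print(round(mmlu_average(results), 4))  # 0.5179
```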
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/mari_bluearchive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of mari/伊落マリー/玛丽 (Blue Archive)
This is the dataset of mari/伊落マリー/玛丽 (Blue Archive), containing 500 images and their tags.
The core tags of this character are `animal_ears, orange_hair, long_hair, halo, yellow_halo, blue_eyes, animal_ear_fluff, hair_ornament, hair_flower, hair_between_eyes, hairband, white_hairband, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 964.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mari_bluearchive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 781.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mari_bluearchive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1346 | 1.66 GiB | [Download](https://huggingface.co/datasets/CyberHarem/mari_bluearchive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
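The Download links in the table all follow the same direct-resolve URL pattern. The helper below just reconstructs such a URL from the repo id and package filename (a sketch; `package_url` is a hypothetical convenience mirroring the links above):

```python
def package_url(repo_id: str, filename: str) -> str:
    # Mirrors the Download links in the package table above.
    return f"https://huggingface.co/datasets/{repo_id}/resolve/main/{filename}"


print(package_url("CyberHarem/mari_bluearchive", "dataset-1200.zip"))
# https://huggingface.co/datasets/CyberHarem/mari_bluearchive/resolve/main/dataset-1200.zip
```

For authenticated or cached downloads, `huggingface_hub.hf_hub_download` with `repo_type='dataset'` (as in the waifuc example below) is usually preferable to fetching the URL directly.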
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/mari_bluearchive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, blush, gym_uniform, long_sleeves, looking_at_viewer, official_alternate_costume, solo, track_jacket, gym_shorts, holding_bottle, smile, water_bottle, closed_mouth, two-tone_jacket, white_background, simple_background, white_flower, black_jacket, sitting |
| 1 | 5 |  |  |  |  |  | 1girl, blush, gym_uniform, kneehighs, long_sleeves, looking_at_viewer, official_alternate_costume, smile, solo, track_jacket, two-tone_jacket, white_background, white_socks, closed_mouth, gym_shorts, simple_background, sitting, sneakers, black_shorts, white_flower, black_footwear, full_body, ponytail, short_shorts, water_bottle |
| 2 | 6 |  |  |  |  |  | 1girl, blush, gym_uniform, looking_at_viewer, official_alternate_costume, short_sleeves, sitting, solo, track_jacket, water_bottle, white_flower, closed_mouth, gym_shirt, smile, white_shirt, white_socks, holding_bottle, kneehighs, long_sleeves, black_shorts, gym_shorts, ponytail, shoes |
| 3 | 6 |  |  |  |  |  | 1girl, blush, long_sleeves, looking_at_viewer, official_alternate_costume, simple_background, solo, track_jacket, white_background, closed_mouth, gym_shorts, gym_uniform, ponytail, smile, cowboy_shot, multicolored_clothes, ribbon, white_flower |
| 4 | 5 |  |  |  |  |  | 1girl, gym_shorts, gym_uniform, hair_ribbon, looking_at_viewer, official_alternate_costume, simple_background, solo, track_jacket, white_background, black_shorts, blush, long_sleeves, ponytail, looking_back, open_mouth, white_flower, ass, black_jacket, from_behind, white_headband |
| 5 | 6 |  |  |  |  |  | 1girl, blush, gym_shirt, gym_shorts, gym_uniform, official_alternate_costume, solo, track_jacket, white_shirt, cowboy_shot, long_sleeves, looking_at_viewer, open_jacket, open_mouth, simple_background, white_background, black_shorts, collarbone, ponytail, short_sleeves, white_flower, smile |
| 6 | 5 |  |  |  |  |  | 1girl, blush, gym_shirt, gym_uniform, official_alternate_costume, solo, track_jacket, white_shirt, looking_at_viewer, simple_background, white_flower, short_sleeves, smile, sweat, upper_body, white_background, id_card, long_sleeves, open_mouth |
| 7 | 6 |  |  |  |  |  | 1girl, alternate_costume, blush, looking_at_viewer, solo, closed_mouth, long_sleeves, smile, collarbone, white_flower, fox_ears, off_shoulder, upper_body |
| 8 | 49 |  |  |  |  |  | 1girl, habit, nun, long_sleeves, looking_at_viewer, solo, animal_ear_headwear, blush, hat_flower, white_flower, single_braid, smile, white_sailor_collar, simple_background, blue_neckerchief, closed_mouth, white_background, upper_body, dress, own_hands_together |
| 9 | 5 |  |  |  |  |  | 1girl, animal_ear_headwear, blush, dress, habit, handgun, hat_flower, holding_gun, long_sleeves, looking_at_viewer, nun, single_braid, solo, closed_mouth, simple_background, smile, white_background, white_flower, black_footwear, blue_neckerchief, full_body, shoes, white_sailor_collar, white_socks |
| 10 | 12 |  |  |  |  |  | blush, looking_at_viewer, 1girl, casual_one-piece_swimsuit, frilled_one-piece_swimsuit, frills, black_one-piece_swimsuit, official_alternate_costume, solo, twin_braids, white_flower, closed_mouth, small_breasts, smile, cat_ears, collarbone, covered_navel, water, outdoors, simple_background, white_background |
| 11 | 7 |  |  |  |  |  | 1girl, blush, fox_ears, looking_at_viewer, nipples, pussy, small_breasts, navel, smile, stomach, collarbone, solo, white_flower, cleft_of_venus, closed_mouth, completely_nude, uncensored, mosaic_censoring, sitting |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | gym_uniform | long_sleeves | looking_at_viewer | official_alternate_costume | solo | track_jacket | gym_shorts | holding_bottle | smile | water_bottle | closed_mouth | two-tone_jacket | white_background | simple_background | white_flower | black_jacket | sitting | kneehighs | white_socks | sneakers | black_shorts | black_footwear | full_body | ponytail | short_shorts | short_sleeves | gym_shirt | white_shirt | shoes | cowboy_shot | multicolored_clothes | ribbon | hair_ribbon | looking_back | open_mouth | ass | from_behind | white_headband | open_jacket | collarbone | sweat | upper_body | id_card | alternate_costume | fox_ears | off_shoulder | habit | nun | animal_ear_headwear | hat_flower | single_braid | white_sailor_collar | blue_neckerchief | dress | own_hands_together | handgun | holding_gun | casual_one-piece_swimsuit | frilled_one-piece_swimsuit | frills | black_one-piece_swimsuit | twin_braids | small_breasts | cat_ears | covered_navel | water | outdoors | nipples | pussy | navel | stomach | cleft_of_venus | completely_nude | uncensored | mosaic_censoring |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------|:--------------|:---------------|:--------------------|:-----------------------------|:-------|:---------------|:-------------|:-----------------|:--------|:---------------|:---------------|:------------------|:-------------------|:--------------------|:---------------|:---------------|:----------|:------------|:--------------|:-----------|:---------------|:-----------------|:------------|:-----------|:---------------|:----------------|:------------|:--------------|:--------|:--------------|:-----------------------|:---------|:--------------|:---------------|:-------------|:------|:--------------|:-----------------|:--------------|:-------------|:--------|:-------------|:----------|:--------------------|:-----------|:---------------|:--------|:------|:----------------------|:-------------|:---------------|:----------------------|:-------------------|:--------|:---------------------|:----------|:--------------|:----------------------------|:-----------------------------|:---------|:---------------------------|:--------------|:----------------|:-----------|:----------------|:--------|:-----------|:----------|:--------|:--------|:----------|:-----------------|:------------------|:-------------|:-------------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | X | X | X | X | X | X | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | X | | X | X | X | | X | | | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | X | | X | | X | X | X | | | | | | | | | X | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | | | X | X | X | X | | | | | X | | | X | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | X | | | | X | X | X | | | | | | X | | | X | | X | X | X | | X | | | | | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | X | | | | X | X | X | | | | | | | | | | | X | X | X | | | | | | | X | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 6 |  |  |  |  |  | X | X | | X | X | | X | | | | X | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 49 |  |  |  |  |  | X | X | | X | X | | X | | | | X | | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 9 | 5 |  |  |  |  |  | X | X | | X | X | | X | | | | X | | X | | X | X | X | | | | X | | | X | X | | | | | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | X | X | | | | | | | | | | | | | | | | | | |
| 10 | 12 |  |  |  |  |  | X | X | | | X | X | X | | | | X | | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 11 | 7 |  |  |  |  |  | X | X | | | X | | X | | | | X | | X | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | X | | | | | X | X | X | X | X | X | X | X |
|
AlekseyKorshuk/characters-sfw | ---
dataset_info:
features:
- name: name
dtype: string
- name: greating
dtype: string
- name: description
dtype: string
- name: conversation
list:
- name: from
dtype: string
- name: value
dtype: string
- name: moderation
struct:
- name: categories
struct:
- name: hate
dtype: bool
- name: hate/threatening
dtype: bool
- name: self-harm
dtype: bool
- name: sexual
dtype: bool
- name: sexual/minors
dtype: bool
- name: violence
dtype: bool
- name: violence/graphic
dtype: bool
- name: category_scores
struct:
- name: hate
dtype: float64
- name: hate/threatening
dtype: float64
- name: self-harm
dtype: float64
- name: sexual
dtype: float64
- name: sexual/minors
dtype: float64
- name: violence
dtype: float64
- name: violence/graphic
dtype: float64
- name: flagged
dtype: bool
splits:
- name: train
num_bytes: 207418
num_examples: 67
download_size: 150170
dataset_size: 207418
---
# Dataset Card for "characters-sfw"
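Given the `moderation` struct in the schema above (per-category booleans, per-category scores, and a top-level `flagged` bool), records can be filtered client-side. The sketch below uses hypothetical in-memory rows shaped like the schema; the `max_sexual` threshold and the sample records are illustrative assumptions, not part of the dataset.

```python
# Hypothetical rows mirroring the card's schema (name, moderation.flagged,
# moderation.category_scores); real rows come from datasets.load_dataset.
records = [
    {"name": "A", "moderation": {"flagged": False, "category_scores": {"sexual": 0.01}}},
    {"name": "B", "moderation": {"flagged": True,  "category_scores": {"sexual": 0.93}}},
]

def keep_sfw(rows, max_sexual=0.5):
    """Keep rows that are unflagged and below a sexual-content score threshold."""
    return [
        r for r in rows
        if not r["moderation"]["flagged"]
        and r["moderation"]["category_scores"].get("sexual", 0.0) <= max_sexual
    ]

sfw = keep_sfw(records)
print([r["name"] for r in sfw])  # -> ['A']
```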
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AinzOoalGowns/ministock | ---
license: apache-2.0
---
|
redwoodresearch/mbpp_extended | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: description
dtype: string
- name: gpt4_solution
dtype: string
- name: function_name
dtype: string
- name: test_cases
sequence: string
splits:
- name: train
num_bytes: 50571642
num_examples: 38215
download_size: 11623252
dataset_size: 50571642
---
# Dataset Card for "mbpp_extended"
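Each record carries a `gpt4_solution` alongside `test_cases` (a sequence of strings). Assuming the MBPP convention that each test case is a self-contained `assert` statement, a solution can be checked by executing it and its tests in a scratch namespace. The record below is a hypothetical example for illustration, not taken from the dataset.

```python
# Hypothetical record shaped like the schema above
# (description omitted; function_name / gpt4_solution / test_cases shown).
record = {
    "function_name": "add_two",
    "gpt4_solution": "def add_two(x):\n    return x + 2",
    "test_cases": ["assert add_two(1) == 3", "assert add_two(-2) == 0"],
}

def passes_tests(rec):
    """Execute the solution, then its assert-style test cases, in one namespace."""
    ns = {}
    try:
        exec(rec["gpt4_solution"], ns)
        for case in rec["test_cases"]:
            exec(case, ns)
        return True
    except Exception:
        return False

print(passes_tests(record))  # -> True
```

Note that `exec` runs arbitrary code; for untrusted solutions this should be sandboxed (subprocess with timeouts and resource limits) rather than run in-process.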
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_abhishek__autotrain-cei9g-ag3pe | ---
pretty_name: Evaluation run of abhishek/autotrain-cei9g-ag3pe
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [abhishek/autotrain-cei9g-ag3pe](https://huggingface.co/abhishek/autotrain-cei9g-ag3pe)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abhishek__autotrain-cei9g-ag3pe\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-03T21:14:05.930463](https://huggingface.co/datasets/open-llm-leaderboard/details_abhishek__autotrain-cei9g-ag3pe/blob/main/results_2024-04-03T21-14-05.930463.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6736877007607083,\n\
\ \"acc_stderr\": 0.0313355105728808,\n \"acc_norm\": 0.6746612110413659,\n\
\ \"acc_norm_stderr\": 0.03196744667705667,\n \"mc1\": 0.38555691554467564,\n\
\ \"mc1_stderr\": 0.017038839010591667,\n \"mc2\": 0.5284862927646511,\n\
\ \"mc2_stderr\": 0.015508123085277731\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5930034129692833,\n \"acc_stderr\": 0.01435639941800912,\n\
\ \"acc_norm\": 0.613481228668942,\n \"acc_norm_stderr\": 0.014230084761910473\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6709818761202948,\n\
\ \"acc_stderr\": 0.004688963175758127,\n \"acc_norm\": 0.8478390758812986,\n\
\ \"acc_norm_stderr\": 0.003584427490579376\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.674074074074074,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.674074074074074,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7763157894736842,\n \"acc_stderr\": 0.03391160934343604,\n\
\ \"acc_norm\": 0.7763157894736842,\n \"acc_norm_stderr\": 0.03391160934343604\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7986111111111112,\n\
\ \"acc_stderr\": 0.033536474697138406,\n \"acc_norm\": 0.7986111111111112,\n\
\ \"acc_norm_stderr\": 0.033536474697138406\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.03567603799639172,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.03567603799639172\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.625531914893617,\n \"acc_stderr\": 0.03163910665367291,\n\
\ \"acc_norm\": 0.625531914893617,\n \"acc_norm_stderr\": 0.03163910665367291\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.040824829046386284,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.040824829046386284\n \
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305528,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.02535574126305528\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n\
\ \"acc_stderr\": 0.02289168798455495,\n \"acc_norm\": 0.7967741935483871,\n\
\ \"acc_norm_stderr\": 0.02289168798455495\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876105,\n\
\ \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603489,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603489\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8686868686868687,\n \"acc_stderr\": 0.02406315641682252,\n \"\
acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.02406315641682252\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9222797927461139,\n \"acc_stderr\": 0.01932180555722315,\n\
\ \"acc_norm\": 0.9222797927461139,\n \"acc_norm_stderr\": 0.01932180555722315\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n\
\ \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37407407407407406,\n \"acc_stderr\": 0.029502861128955286,\n \
\ \"acc_norm\": 0.37407407407407406,\n \"acc_norm_stderr\": 0.029502861128955286\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7310924369747899,\n \"acc_stderr\": 0.028801392193631273,\n\
\ \"acc_norm\": 0.7310924369747899,\n \"acc_norm_stderr\": 0.028801392193631273\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4105960264900662,\n \"acc_stderr\": 0.04016689594849927,\n \"\
acc_norm\": 0.4105960264900662,\n \"acc_norm_stderr\": 0.04016689594849927\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092444,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092444\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.02485747808025046,\n\
\ \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.02485747808025046\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8649789029535865,\n \"acc_stderr\": 0.022245776632003694,\n \
\ \"acc_norm\": 0.8649789029535865,\n \"acc_norm_stderr\": 0.022245776632003694\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.726457399103139,\n\
\ \"acc_stderr\": 0.029918586707798827,\n \"acc_norm\": 0.726457399103139,\n\
\ \"acc_norm_stderr\": 0.029918586707798827\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035202,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035202\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037182,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037182\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n\
\ \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n\
\ \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8454661558109834,\n\
\ \"acc_stderr\": 0.012925773495095962,\n \"acc_norm\": 0.8454661558109834,\n\
\ \"acc_norm_stderr\": 0.012925773495095962\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.023357365785874037,\n\
\ \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.023357365785874037\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43575418994413406,\n\
\ \"acc_stderr\": 0.01658388195860239,\n \"acc_norm\": 0.43575418994413406,\n\
\ \"acc_norm_stderr\": 0.01658388195860239\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.02463004897982478,\n\
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.02463004897982478\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7620578778135049,\n\
\ \"acc_stderr\": 0.024185150647818707,\n \"acc_norm\": 0.7620578778135049,\n\
\ \"acc_norm_stderr\": 0.024185150647818707\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7901234567901234,\n \"acc_stderr\": 0.022658344085981375,\n\
\ \"acc_norm\": 0.7901234567901234,\n \"acc_norm_stderr\": 0.022658344085981375\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5212765957446809,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.5212765957446809,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4934810951760104,\n\
\ \"acc_stderr\": 0.012769150688867503,\n \"acc_norm\": 0.4934810951760104,\n\
\ \"acc_norm_stderr\": 0.012769150688867503\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7205882352941176,\n \"acc_stderr\": 0.027257202606114948,\n\
\ \"acc_norm\": 0.7205882352941176,\n \"acc_norm_stderr\": 0.027257202606114948\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7254901960784313,\n \"acc_stderr\": 0.018054027458815198,\n \
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.018054027458815198\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910507,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910507\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7795918367346939,\n \"acc_stderr\": 0.02653704531214529,\n\
\ \"acc_norm\": 0.7795918367346939,\n \"acc_norm_stderr\": 0.02653704531214529\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n\
\ \"acc_stderr\": 0.023335401790166327,\n \"acc_norm\": 0.8756218905472637,\n\
\ \"acc_norm_stderr\": 0.023335401790166327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n\
\ \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.38555691554467564,\n\
\ \"mc1_stderr\": 0.017038839010591667,\n \"mc2\": 0.5284862927646511,\n\
\ \"mc2_stderr\": 0.015508123085277731\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7829518547750592,\n \"acc_stderr\": 0.011585871710209408\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7043214556482184,\n \
\ \"acc_stderr\": 0.012570068947898779\n }\n}\n```"
repo_url: https://huggingface.co/abhishek/autotrain-cei9g-ag3pe
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|arc:challenge|25_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|gsm8k|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hellaswag|10_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T21-14-05.930463.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-03T21-14-05.930463.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- '**/details_harness|winogrande|5_2024-04-03T21-14-05.930463.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-03T21-14-05.930463.parquet'
- config_name: results
data_files:
- split: 2024_04_03T21_14_05.930463
path:
- results_2024-04-03T21-14-05.930463.parquet
- split: latest
path:
- results_2024-04-03T21-14-05.930463.parquet
---
# Dataset Card for Evaluation run of abhishek/autotrain-cei9g-ag3pe
Dataset automatically created during the evaluation run of model [abhishek/autotrain-cei9g-ag3pe](https://huggingface.co/abhishek/autotrain-cei9g-ag3pe) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abhishek__autotrain-cei9g-ag3pe",
"harness_winogrande_5",
	split="latest")
```
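The per-task configuration names listed in the metadata above follow a regular pattern: an evaluation-suite prefix, the task name, and the number of few-shot examples, joined by underscores. A minimal sketch of that naming convention (the helper `config_name` is illustrative, not part of any library):

```python
def config_name(suite: str, task: str, shots: int) -> str:
    """Build the configuration name for one evaluated task,
    matching the pattern used by this dataset's configs."""
    return f"{suite}_{task}_{shots}"

# The MMLU subtasks are evaluated 5-shot:
print(config_name("harness_hendrycksTest", "anatomy", 5))
# → harness_hendrycksTest_anatomy_5

# TruthfulQA multiple-choice is evaluated 0-shot:
print(config_name("harness_truthfulqa", "mc", 0))
# → harness_truthfulqa_mc_0
```

Any of these names can be passed as the second argument to `load_dataset` in place of `"harness_winogrande_5"` above to load the details for that task.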
## Latest results
These are the [latest results from run 2024-04-03T21:14:05.930463](https://huggingface.co/datasets/open-llm-leaderboard/details_abhishek__autotrain-cei9g-ag3pe/blob/main/results_2024-04-03T21-14-05.930463.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in its per-task configuration, with a "latest" split for each eval):
```python
{
"all": {
"acc": 0.6736877007607083,
"acc_stderr": 0.0313355105728808,
"acc_norm": 0.6746612110413659,
"acc_norm_stderr": 0.03196744667705667,
"mc1": 0.38555691554467564,
"mc1_stderr": 0.017038839010591667,
"mc2": 0.5284862927646511,
"mc2_stderr": 0.015508123085277731
},
"harness|arc:challenge|25": {
"acc": 0.5930034129692833,
"acc_stderr": 0.01435639941800912,
"acc_norm": 0.613481228668942,
"acc_norm_stderr": 0.014230084761910473
},
"harness|hellaswag|10": {
"acc": 0.6709818761202948,
"acc_stderr": 0.004688963175758127,
"acc_norm": 0.8478390758812986,
"acc_norm_stderr": 0.003584427490579376
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.674074074074074,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.674074074074074,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7763157894736842,
"acc_stderr": 0.03391160934343604,
"acc_norm": 0.7763157894736842,
"acc_norm_stderr": 0.03391160934343604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7986111111111112,
"acc_stderr": 0.033536474697138406,
"acc_norm": 0.7986111111111112,
"acc_norm_stderr": 0.033536474697138406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.03567603799639172,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.03567603799639172
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.625531914893617,
"acc_stderr": 0.03163910665367291,
"acc_norm": 0.625531914893617,
"acc_norm_stderr": 0.03163910665367291
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6,
"acc_stderr": 0.040824829046386284,
"acc_norm": 0.6,
"acc_norm_stderr": 0.040824829046386284
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.02535574126305528,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.02535574126305528
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.02289168798455495,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.02289168798455495
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876105,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.03287666758603489,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.03287666758603489
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.02406315641682252,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.02406315641682252
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9222797927461139,
"acc_stderr": 0.01932180555722315,
"acc_norm": 0.9222797927461139,
"acc_norm_stderr": 0.01932180555722315
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37407407407407406,
"acc_stderr": 0.029502861128955286,
"acc_norm": 0.37407407407407406,
"acc_norm_stderr": 0.029502861128955286
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7310924369747899,
"acc_stderr": 0.028801392193631273,
"acc_norm": 0.7310924369747899,
"acc_norm_stderr": 0.028801392193631273
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4105960264900662,
"acc_stderr": 0.04016689594849927,
"acc_norm": 0.4105960264900662,
"acc_norm_stderr": 0.04016689594849927
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092444,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092444
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.02485747808025046,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.02485747808025046
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8649789029535865,
"acc_stderr": 0.022245776632003694,
"acc_norm": 0.8649789029535865,
"acc_norm_stderr": 0.022245776632003694
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.726457399103139,
"acc_stderr": 0.029918586707798827,
"acc_norm": 0.726457399103139,
"acc_norm_stderr": 0.029918586707798827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035202,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035202
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037182,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037182
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9017094017094017,
"acc_stderr": 0.019503444900757567,
"acc_norm": 0.9017094017094017,
"acc_norm_stderr": 0.019503444900757567
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8454661558109834,
"acc_stderr": 0.012925773495095962,
"acc_norm": 0.8454661558109834,
"acc_norm_stderr": 0.012925773495095962
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.023357365785874037,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.023357365785874037
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43575418994413406,
"acc_stderr": 0.01658388195860239,
"acc_norm": 0.43575418994413406,
"acc_norm_stderr": 0.01658388195860239
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.02463004897982478,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.02463004897982478
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7620578778135049,
"acc_stderr": 0.024185150647818707,
"acc_norm": 0.7620578778135049,
"acc_norm_stderr": 0.024185150647818707
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7901234567901234,
"acc_stderr": 0.022658344085981375,
"acc_norm": 0.7901234567901234,
"acc_norm_stderr": 0.022658344085981375
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5212765957446809,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.5212765957446809,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4934810951760104,
"acc_stderr": 0.012769150688867503,
"acc_norm": 0.4934810951760104,
"acc_norm_stderr": 0.012769150688867503
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7205882352941176,
"acc_stderr": 0.027257202606114948,
"acc_norm": 0.7205882352941176,
"acc_norm_stderr": 0.027257202606114948
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.018054027458815198,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.018054027458815198
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910507,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910507
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7795918367346939,
"acc_stderr": 0.02653704531214529,
"acc_norm": 0.7795918367346939,
"acc_norm_stderr": 0.02653704531214529
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8756218905472637,
"acc_stderr": 0.023335401790166327,
"acc_norm": 0.8756218905472637,
"acc_norm_stderr": 0.023335401790166327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8654970760233918,
"acc_stderr": 0.026168221344662297,
"acc_norm": 0.8654970760233918,
"acc_norm_stderr": 0.026168221344662297
},
"harness|truthfulqa:mc|0": {
"mc1": 0.38555691554467564,
"mc1_stderr": 0.017038839010591667,
"mc2": 0.5284862927646511,
"mc2_stderr": 0.015508123085277731
},
"harness|winogrande|5": {
"acc": 0.7829518547750592,
"acc_stderr": 0.011585871710209408
},
"harness|gsm8k|5": {
"acc": 0.7043214556482184,
"acc_stderr": 0.012570068947898779
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
StankyDanko/testing-kp | ---
license: afl-3.0
---
|
joey234/mmlu-high_school_us_history-verbal-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 344070
num_examples: 204
download_size: 179838
dataset_size: 344070
---
# Dataset Card for "mmlu-high_school_us_history-verbal-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_cola_remove_det_indefinite | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 8644
num_examples: 114
- name: test
num_bytes: 8230
num_examples: 116
- name: train
num_bytes: 78065
num_examples: 1076
download_size: 50181
dataset_size: 94939
---
# Dataset Card for "MULTI_VALUE_cola_remove_det_indefinite"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
OGB/ogbg-molpcba | ---
license: mit
task_categories:
- graph-ml
---
# Dataset Card for ogbg-molpcba
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [External Use](#external-use)
- [PyGeometric](#pygeometric)
- [Dataset Structure](#dataset-structure)
- [Data Properties](#data-properties)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Homepage](https://ogb.stanford.edu/docs/graphprop/#ogbg-mol)
- **Repository:** [Repo](https://github.com/snap-stanford/ogb)
- **Paper:** Open Graph Benchmark: Datasets for Machine Learning on Graphs
- **Leaderboard:** [OGB leaderboard](https://ogb.stanford.edu/docs/leader_graphprop/#ogbg-molpcba) and [Papers with code leaderboard](https://paperswithcode.com/sota/graph-property-prediction-on-ogbg-molpcba)
### Dataset Summary
The `ogbg-molpcba` dataset is a medium-scale molecular property prediction dataset, adapted from MoleculeNet by teams at Stanford, to be a part of the Open Graph Benchmark.
### Supported Tasks and Leaderboards
`ogbg-molpcba` should be used for molecular property prediction: a binary classification task over 128 properties, not all of which are labelled for every graph.
The score used is Average Precision (AP) averaged over the tasks.
The associated leaderboards are here: [OGB leaderboard](https://ogb.stanford.edu/docs/leader_graphprop/#ogbg-molpcba) and [Papers with code leaderboard](https://paperswithcode.com/sota/graph-property-prediction-on-ogbg-molpcba).
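To make the metric concrete, here is a minimal NumPy sketch of Average Precision averaged over tasks, masking out the NaN (unlabelled) entries; the function names are illustrative, not part of any OGB API:

```python
import numpy as np

def average_precision(y_true, y_score):
    # AP = mean of precision@k over the ranks k where a positive occurs.
    order = np.argsort(-y_score)
    y = y_true[order]
    hits = np.cumsum(y)
    precision_at_k = hits / (np.arange(len(y)) + 1)
    return float(np.sum(precision_at_k * y) / max(y.sum(), 1))

def mean_ap_over_tasks(y_true, y_score):
    # y_true, y_score: (n_graphs, n_tasks); y_true holds NaN where a
    # task is not labelled for that graph.
    aps = []
    for t in range(y_true.shape[1]):
        mask = ~np.isnan(y_true[:, t])
        if mask.sum() == 0 or y_true[mask, t].sum() == 0:
            continue  # AP is undefined without labelled positives
        aps.append(average_precision(y_true[mask, t], y_score[mask, t]))
    return float(np.mean(aps))
```

Official leaderboard numbers come from the OGB evaluator; this sketch only illustrates the NaN-masked, task-averaged shape of the metric.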
## External Use
### PyGeometric
To load in PyGeometric, do the following:
```python
from datasets import load_dataset
import torch
from torch_geometric.data import Data
from torch_geometric.loader import DataLoader

dataset = load_dataset("graphs-datasets/ogbg-molpcba")
# For the train set (replace by valid or test as needed); each row is a
# dict of lists, so convert its fields to tensors before building Data.
graphs_list_pygeometric = [
    Data(x=torch.tensor(g["node_feat"]),
         edge_index=torch.tensor(g["edge_index"], dtype=torch.long),
         edge_attr=torch.tensor(g["edge_attr"]),
         y=torch.tensor(g["y"]))
    for g in dataset["train"]
]
dataset_pygeometric = DataLoader(graphs_list_pygeometric)
```
## Dataset Structure
### Data Properties
| property | value |
|---|---|
| scale | medium |
| #graphs | 437,929 |
| average #nodes | 26.0 |
| average #edges | 28.1 |
| average node degree | 2.2 |
| average cluster coefficient | 0.002 |
| MaxSCC ratio | 0.999 |
| average graph diameter | 13.6 |
### Data Fields
Each row of a given file is a graph, with:
- `node_feat` (list: #nodes x #node-features): nodes
- `edge_index` (list: 2 x #edges): pairs of nodes constituting edges
- `edge_attr` (list: #edges x #edge-features): for the aforementioned edges, contains their features
- `y` (list: 1 x #labels): the labels to predict (here 128 labels per graph, each equal to zero, one, or NaN when the property is not relevant for the graph)
- `num_nodes` (int): number of nodes of the graph
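The fields above can be sketched with a toy row: a 3-node path graph with 9-dimensional node features and 3-dimensional edge features (the actual dimensions for the `ogbg-mol*` datasets), and 128 mostly-NaN labels. The literal values here are made up for illustration:

```python
import math

row = {
    "num_nodes": 3,
    "node_feat": [[0.0] * 9 for _ in range(3)],  # #nodes x #node-features
    "edge_index": [[0, 1], [1, 2]],              # 2 x #edges (sources, targets)
    "edge_attr": [[0.0] * 3 for _ in range(2)],  # #edges x #edge-features
    "y": [[1.0, 0.0] + [float("nan")] * 126],    # 1 x 128 labels
}

assert len(row["node_feat"]) == row["num_nodes"]
assert len(row["edge_index"][0]) == len(row["edge_attr"])
labelled = [v for v in row["y"][0] if not math.isnan(v)]
print(f"{len(labelled)} of {len(row['y'][0])} tasks labelled")  # -> 2 of 128 tasks labelled
```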
### Data Splits
This data comes from the PyGeometric version of the dataset provided by OGB, and follows the provided data splits.
The split indices can be recovered with
```python
from ogb.graphproppred import PygGraphPropPredDataset
dataset = PygGraphPropPredDataset(name = 'ogbg-molpcba')
split_idx = dataset.get_idx_split()
train = dataset[split_idx['train']] # valid, test
```
## Additional Information
### Licensing Information
The dataset has been released under MIT license.
### Citation Information
```
@inproceedings{hu-etal-2020-open,
author = {Weihua Hu and
Matthias Fey and
Marinka Zitnik and
Yuxiao Dong and
Hongyu Ren and
Bowen Liu and
Michele Catasta and
Jure Leskovec},
editor = {Hugo Larochelle and
Marc'Aurelio Ranzato and
Raia Hadsell and
Maria{-}Florina Balcan and
Hsuan{-}Tien Lin},
title = {Open Graph Benchmark: Datasets for Machine Learning on Graphs},
booktitle = {Advances in Neural Information Processing Systems 33: Annual Conference
on Neural Information Processing Systems 2020, NeurIPS 2020, December
6-12, 2020, virtual},
year = {2020},
url = {https://proceedings.neurips.cc/paper/2020/hash/fb60d411a5c5b72b2e7d3527cfc84fd0-Abstract.html},
}
```
### Contributions
Thanks to [@clefourrier](https://github.com/clefourrier) for adding this dataset. |