datasetId stringlengths 2 117 | card stringlengths 19 1.01M |
|---|---|
CyberHarem/ran_pokemon | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ran/ラン (Pokémon)
This is the dataset of ran/ラン (Pokémon), containing 71 images and their tags.
The core tags of this character are `black_hair, hair_bun, single_hair_bun, blue_eyes, bangs`, which are pruned in this dataset.
Images were crawled from multiple sites (e.g. Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 71 | 38.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ran_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 71 | 29.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ran_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | Dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 92 | 45.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ran_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 71 | 36.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ran_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | Dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 92 | 55.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ran_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). To use it, run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# Download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/ran_pokemon',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# Extract the files to a local directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# Load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
Tag clustering results are listed below; recurring outfits may be identifiable from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, open_mouth, looking_at_viewer, :d, hair_ribbon, pants, solo, blue_hair, full_body, long_sleeves, shoes, sidelocks, star_(symbol) |
| 1 | 9 |  |  |  |  |  | long_sleeves, open_mouth, 1girl, :d, blue_jacket, sidelocks, star_(symbol), tongue, 1boy, black_eyes, blue_pants, grey_eyes, hair_ribbon, pokemon_(creature), short_hair, solo |
| 2 | 6 |  |  |  |  |  | closed_mouth, outdoors, pants, short_hair, smile, 1boy, black_eyes, day, male_focus, pokemon_(creature), 1girl, looking_at_viewer, sitting, sky, standing |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | open_mouth | looking_at_viewer | :d | hair_ribbon | pants | solo | blue_hair | full_body | long_sleeves | shoes | sidelocks | star_(symbol) | blue_jacket | tongue | 1boy | black_eyes | blue_pants | grey_eyes | pokemon_(creature) | short_hair | closed_mouth | outdoors | smile | day | male_focus | sitting | sky | standing |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:--------------------|:-----|:--------------|:--------|:-------|:------------|:------------|:---------------|:--------|:------------|:----------------|:--------------|:---------|:-------|:-------------|:-------------|:------------|:---------------------|:-------------|:---------------|:-----------|:--------|:------|:-------------|:----------|:------|:-----------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | X | | X | X | | X | | | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | | X | | | X | | | | | | | | | | X | X | | | X | X | X | X | X | X | X | X | X | X |
|
Nerfgun3/barbosa_style | ---
language:
- en
license: creativeml-openrail-m
thumbnail: "https://huggingface.co/datasets/Nerfgun3/barbosa_style/resolve/main/barbosa_showcase.png"
tags:
- stable-diffusion
- text-to-image
- image-to-image
inference: false
---
# Barbosa Style Embedding / Textual Inversion
<img alt="Showcase" src="https://huggingface.co/datasets/Nerfgun3/barbosa_style/resolve/main/barbosa_showcase.png"/>
## Usage
To use this embedding, download the file and drop it into the "\stable-diffusion-webui\embeddings" folder.
To use it in a prompt: ```"barbosa_style"```
Personally, I would recommend using my embeddings with a strength of 0.8, e.g. ```"(barbosa_style:0.8)"```
I trained the embedding for two epochs, up to 8000 steps.
I hope you enjoy the embedding. If you have any questions, you can reach me on Discord: "Nerfgun3#7508"
## License
This embedding is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage.
The CreativeML OpenRAIL License specifies:
1. You can't use the embedding to deliberately produce or share illegal or harmful outputs or content
2. The authors claim no rights to the outputs you generate; you are free to use them and are accountable for their use, which must not go against the provisions set in the license
3. You may re-distribute the weights and use the embedding commercially and/or as a service. If you do, please be aware that you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M with all your users (please read the license entirely and carefully)
[Please read the full license here](https://huggingface.co/spaces/CompVis/stable-diffusion-license) |
SkyWR/Othavio | ---
license: openrail
---
|
vwxyzjn/openhermes-dev__mistralai_Mixtral-8x7B-Instruct-v0.1__1706886961 | ---
dataset_info:
features:
- name: source
dtype: string
- name: skip_prompt_formatting
dtype: bool
- name: title
dtype: 'null'
- name: custom_instruction
dtype: 'null'
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: weight
dtype: 'null'
- name: system_prompt
dtype: 'null'
- name: idx
dtype: 'null'
- name: id
dtype: 'null'
- name: model
dtype: 'null'
- name: topic
dtype: 'null'
- name: avatarUrl
dtype: 'null'
- name: model_name
dtype: 'null'
- name: language
dtype: 'null'
- name: views
dtype: 'null'
- name: hash
dtype: 'null'
- name: category
dtype: string
- name: prompt
dtype: string
- name: chosen_policy
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected_policy
dtype: string
splits:
- name: train_prefs
num_bytes: 184962
num_examples: 23
- name: test_prefs
num_bytes: 1818
num_examples: 1
download_size: 194962
dataset_size: 186780
configs:
- config_name: default
data_files:
- split: train_prefs
path: data/train_prefs-*
- split: test_prefs
path: data/test_prefs-*
---
|
PhaniManda/autotrain-data-test-auto | ---
task_categories:
- text-classification
---
# AutoTrain Dataset for project: test-auto
## Dataset Description
This dataset has been automatically processed by AutoTrain for project test-auto.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"text": "I'm indifferent towards this restaurant. The food was average, and the service was neither exceptional nor terrible.",
"target": 1
},
{
"text": "\"The product I received was damaged and didn't work properly. I reached out to customer support, but they were unhelpful and unresponsive.\"",
"target": 0
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"text": "Value(dtype='string', id=None)",
"target": "ClassLabel(names=['Negative', 'Neutral', 'Positive'], id=None)"
}
```
### Dataset Splits
This dataset is split into train and validation sets. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 26 |
| valid | 8 |
|
TurtleLiu/MentalLLama_DR_300 | ---
license: apache-2.0
---
|
AIRI-Institute/I4TALK_DATA | ---
license: cc-by-sa-4.0
---
|
adambuttrick/360K-funding-statement-sentences-name-identifier | ---
license: cc0-1.0
---
|
rehanbrr/gender-DEI-data | ---
dataset_info:
features:
- name: doi
dtype: string
- name: id
dtype: string
- name: title
dtype: string
- name: chunk_id
dtype: string
- name: chunk
dtype: string
splits:
- name: train
num_bytes: 235089
num_examples: 156
download_size: 130544
dataset_size: 235089
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "gender-DEI-data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ai-bites/databricks-mini | ---
license: mit
---
This is a subset of the Databricks Dolly 15k dataset `databricks/databricks-dolly-15k`, used for fine-tuning Google's Gemma model `google/gemma-2b`.
This version contains only the records without context, to match the dataset used in Google's Keras fine-tuning example.
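A minimal sketch of how such a context-free subset could be derived. The field names (`instruction`, `context`, `response`) follow the `databricks/databricks-dolly-15k` schema, and the sample records are made up for illustration:

```python
# Sketch: keep only records whose `context` field is empty,
# mirroring how this subset removes context-bearing examples.
# Field names follow the databricks/databricks-dolly-15k schema;
# the records themselves are illustrative.
records = [
    {"instruction": "Summarize the passage.", "context": "Some passage...", "response": "..."},
    {"instruction": "What is the capital of France?", "context": "", "response": "Paris"},
]

no_context = [r for r in records if not r["context"]]
print(len(no_context))  # 1
```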
|
gaianet/london | ---
license: apache-2.0
---
|
bigbio/chemprot |
---
language:
- en
bigbio_language:
- English
license: other
multilinguality: monolingual
bigbio_license_shortname: PUBLIC_DOMAIN_MARK_1p0
pretty_name: ChemProt
homepage: https://biocreative.bioinformatics.udel.edu/tasks/biocreative-vi/track-5/
bigbio_pubmed: True
bigbio_public: True
bigbio_tasks:
- RELATION_EXTRACTION
- NAMED_ENTITY_RECOGNITION
---
# Dataset Card for ChemProt
## Dataset Description
- **Homepage:** https://biocreative.bioinformatics.udel.edu/tasks/biocreative-vi/track-5/
- **Pubmed:** True
- **Public:** True
- **Tasks:** RE,NER
The BioCreative VI Chemical-Protein interaction dataset identifies chemical and protein entities
and their likely relations to one another. Compounds are
generally agonists (activators) or antagonists (inhibitors) of proteins.
## Citation Information
```
@article{DBLP:journals/biodb/LiSJSWLDMWL16,
  author  = {Krallinger, M. and Rabal, O. and Lourenço, A.},
  title   = {Overview of the BioCreative VI chemical-protein interaction Track},
  journal = {Proceedings of the BioCreative VI Workshop},
  pages   = {141--146},
  year    = {2017},
  url     = {https://biocreative.bioinformatics.udel.edu/tasks/biocreative-vi/track-5/},
}
```
|
HannahKniesel/ade20k_gt | ---
dataset_info:
features:
- name: image
dtype: image
- name: conditioning_image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 1087314645.2
num_examples: 20210
download_size: 904740489
dataset_size: 1087314645.2
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
wuming156/yu | ---
license: unknown
---
|
leo214gamer/satono | ---
license: openrail
---
|
breadlicker45/Calorie-dataset | ---
license: other
---
The API I used to get the calorie values may be unreliable. |
ZurichNLP/rsd-ists-2016 | ---
license: cc-by-sa-4.0
language_creators:
- machine-generated
dataset_info:
features:
- name: tokens_a
sequence: string
- name: tokens_b
sequence: string
- name: labels_a
sequence: float64
- name: labels_b
sequence: float64
- name: lang_a
dtype: string
- name: lang_b
dtype: string
- name: subset
dtype: string
- name: id
dtype: string
- name: alignments
dtype: string
splits:
- name: train_en
num_bytes: 1640900
num_examples: 1506
- name: train_de
num_bytes: 1101404
num_examples: 3012
- name: train_es
num_bytes: 1154765
num_examples: 3012
- name: train_fr
num_bytes: 1206414
num_examples: 3012
- name: train_ja
num_bytes: 838252
num_examples: 3012
- name: train_ko
num_bytes: 829328
num_examples: 3012
- name: train_zh
num_bytes: 796140
num_examples: 3012
- name: test_en
num_bytes: 833900
num_examples: 750
- name: test_de
num_bytes: 558624
num_examples: 1500
- name: test_es
num_bytes: 580224
num_examples: 1500
- name: test_fr
num_bytes: 610017
num_examples: 1500
- name: test_ja
num_bytes: 425912
num_examples: 1500
- name: test_ko
num_bytes: 424407
num_examples: 1500
- name: test_zh
num_bytes: 403680
num_examples: 1500
download_size: 2569205
dataset_size: 11403967
task_categories:
- token-classification
language:
- en
- de
- es
- fr
- ja
- ko
- zh
size_categories:
- 1K<n<10K
---
Training and test data for the task of Recognizing Semantic Differences (RSD).
[See the paper](https://arxiv.org/abs/2305.13303) for details on how the dataset was created, and see our code at https://github.com/ZurichNLP/recognizing-semantic-differences for an example of how to use the data for evaluation.
The data are derived from the [SemEval-2016 Task 2 for Interpretable Semantic Textual Similarity](https://alt.qcri.org/semeval2016/task2/) organized by [Agirre et al. (2016)](http://dx.doi.org/10.18653/v1/S16-1082).
The original URLs of the data are:
* Train: http://alt.qcri.org/semeval2016/task2/data/uploads/train_2015_10_22.utf-8.tar.gz
* Test: http://alt.qcri.org/semeval2016/task2/data/uploads/test_goldstandard.tar.gz
The translations into non-English languages have been created using machine translation (DeepL).
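Since the labels are per-token floats, a simple way to use them is to flag the tokens whose score exceeds a threshold as "semantically different". A minimal sketch, assuming scores near 1 indicate a difference (the threshold and the example tokens/scores below are illustrative, not taken from the dataset):

```python
# Sketch: given a token sequence and its per-token difference scores
# (as in the `tokens_a` / `labels_a` fields), flag tokens whose score
# exceeds a threshold. The 0.5 threshold is illustrative only.
def flag_differences(tokens, labels, threshold=0.5):
    return [tok for tok, score in zip(tokens, labels) if score > threshold]

tokens_a = ["The", "cat", "sat", "quietly"]
labels_a = [0.0, 0.0, 0.0, 1.0]
print(flag_differences(tokens_a, labels_a))  # ['quietly']
```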
## Citation
```bibtex
@inproceedings{vamvas-sennrich-2023-rsd,
title={Towards Unsupervised Recognition of Token-level Semantic Differences in Related Documents},
author={Jannis Vamvas and Rico Sennrich},
month = dec,
year = "2023",
booktitle = "Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing",
address = "Singapore",
publisher = "Association for Computational Linguistics",
}
```
|
jlbaker361/little_dataset-combined | ---
dataset_info:
features:
- name: image
dtype: image
- name: src
dtype: string
- name: split
dtype: string
- name: id
dtype: int64
- name: name
dtype: string
- name: caption
dtype: string
splits:
- name: train
num_bytes: 3528300.0
num_examples: 10
download_size: 355277
dataset_size: 3528300.0
---
# Dataset Card for "little_dataset-combined"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chuyin0321/eps-trend-stocks | ---
dataset_info:
features:
- name: symbol
dtype: string
- name: date
dtype: string
- name: current_qtr
dtype: string
- name: current_estimate_current_qtr
dtype: float64
- name: next_qtr
dtype: string
- name: current_estimate_next_qtr
dtype: float64
- name: current_year
dtype: int64
- name: current_estimate_current_year
dtype: float64
- name: next_year
dtype: int64
- name: current_estimate_next_year
dtype: float64
- name: 7_days_ago_current_qtr
dtype: float64
- name: 7_days_ago_next_qtr
dtype: float64
- name: 7_days_ago_current_year
dtype: float64
- name: 7_days_ago_next_year
dtype: float64
- name: 30_days_ago_current_qtr
dtype: float64
- name: 30_days_ago_next_qtr
dtype: float64
- name: 30_days_ago_current_year
dtype: float64
- name: 30_days_ago_next_year
dtype: float64
- name: 60_days_ago_current_qtr
dtype: float64
- name: 60_days_ago_next_qtr
dtype: float64
- name: 60_days_ago_current_year
dtype: float64
- name: 60_days_ago_next_year
dtype: float64
- name: 90_days_ago_current_qtr
dtype: float64
- name: 90_days_ago_next_qtr
dtype: float64
- name: 90_days_ago_current_year
dtype: float64
- name: 90_days_ago_next_year
dtype: float64
splits:
- name: train
num_bytes: 300316
num_examples: 1356
download_size: 140021
dataset_size: 300316
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "eps-trend-stocks"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zorilladev/ner_train_judgement_temp | ---
language:
- en
--- |
heliosprime/twitter_dataset_1713187325 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 19949
num_examples: 54
download_size: 19938
dataset_size: 19949
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713187325"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EgilKarlsen/CSIC_RoBERTa_FT | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 115621182
num_examples: 37500
- name: test
num_bytes: 38540387
num_examples: 12500
download_size: 211875916
dataset_size: 154161569
---
# Dataset Card for "CSIC_RoBERTa_FT"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
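The schema above lists 768 float32 columns named `'0'` through `'767'` plus a string `label`, which is consistent with RoBERTa-base's 768-dimensional hidden states. A minimal sketch of reassembling one such row into an ordered feature vector (the row values below are made up for illustration, not taken from the dataset):

```python
# Reassemble the numbered float32 columns of one example into an ordered
# feature vector. The column names '0'..'767' match RoBERTa-base's 768-dim
# hidden size; the zero-valued row below is a made-up illustration.
feature_names = [str(i) for i in range(768)]

row = {name: 0.0 for name in feature_names}
row["label"] = "example-label"  # hypothetical label value

vector = [row[name] for name in feature_names]  # ordered 768-dim vector
```

Iterating the column names in numeric order matters here: the string keys `'0'`..`'767'` would sort lexicographically (`'0', '1', '10', ...`) if taken directly from a dict of a parquet row.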
autoevaluate/autoeval-eval-mathemakitten__winobias_antistereotype_test_cot_v1-math-6c03d1-1913164906 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- mathemakitten/winobias_antistereotype_test_cot_v1
eval_info:
task: text_zero_shot_classification
model: facebook/opt-6.7b
metrics: []
dataset_name: mathemakitten/winobias_antistereotype_test_cot_v1
dataset_config: mathemakitten--winobias_antistereotype_test_cot_v1
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: facebook/opt-6.7b
* Dataset: mathemakitten/winobias_antistereotype_test_cot_v1
* Config: mathemakitten--winobias_antistereotype_test_cot_v1
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@mathemakitten](https://huggingface.co/mathemakitten) for evaluating this model. |
open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-chat-hf-dpo | ---
pretty_name: Evaluation run of TheTravellingEngineer/llama2-7b-chat-hf-dpo
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheTravellingEngineer/llama2-7b-chat-hf-dpo](https://huggingface.co/TheTravellingEngineer/llama2-7b-chat-hf-dpo)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-chat-hf-dpo\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-21T15:24:24.824403](https://huggingface.co/datasets/open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-chat-hf-dpo/blob/main/results_2023-10-21T15-24-24.824403.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.06763842281879194,\n\
\ \"em_stderr\": 0.0025717489509556085,\n \"f1\": 0.13085570469798627,\n\
\ \"f1_stderr\": 0.0028825856446422905,\n \"acc\": 0.39549166962367155,\n\
\ \"acc_stderr\": 0.009921949302668327\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.06763842281879194,\n \"em_stderr\": 0.0025717489509556085,\n\
\ \"f1\": 0.13085570469798627,\n \"f1_stderr\": 0.0028825856446422905\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07354056103108415,\n \
\ \"acc_stderr\": 0.0071898357543652685\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7174427782162589,\n \"acc_stderr\": 0.012654062850971384\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TheTravellingEngineer/llama2-7b-chat-hf-dpo
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_10_21T15_24_24.824403
path:
- '**/details_harness|drop|3_2023-10-21T15-24-24.824403.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-21T15-24-24.824403.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_21T15_24_24.824403
path:
- '**/details_harness|gsm8k|5_2023-10-21T15-24-24.824403.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-21T15-24-24.824403.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_21T15_24_24.824403
path:
- '**/details_harness|winogrande|5_2023-10-21T15-24-24.824403.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-21T15-24-24.824403.parquet'
- config_name: results
data_files:
- split: 2023_10_21T15_24_24.824403
path:
- results_2023-10-21T15-24-24.824403.parquet
- split: latest
path:
- results_2023-10-21T15-24-24.824403.parquet
---
# Dataset Card for Evaluation run of TheTravellingEngineer/llama2-7b-chat-hf-dpo
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheTravellingEngineer/llama2-7b-chat-hf-dpo
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheTravellingEngineer/llama2-7b-chat-hf-dpo](https://huggingface.co/TheTravellingEngineer/llama2-7b-chat-hf-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-chat-hf-dpo",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-21T15:24:24.824403](https://huggingface.co/datasets/open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-chat-hf-dpo/blob/main/results_2023-10-21T15-24-24.824403.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.06763842281879194,
"em_stderr": 0.0025717489509556085,
"f1": 0.13085570469798627,
"f1_stderr": 0.0028825856446422905,
"acc": 0.39549166962367155,
"acc_stderr": 0.009921949302668327
},
"harness|drop|3": {
"em": 0.06763842281879194,
"em_stderr": 0.0025717489509556085,
"f1": 0.13085570469798627,
"f1_stderr": 0.0028825856446422905
},
"harness|gsm8k|5": {
"acc": 0.07354056103108415,
"acc_stderr": 0.0071898357543652685
},
"harness|winogrande|5": {
"acc": 0.7174427782162589,
"acc_stderr": 0.012654062850971384
}
}
```
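The top-level `"acc"` in the `"all"` block above is just the unweighted mean of the two per-task accuracies (gsm8k and winogrande; the drop task reports em/f1 rather than acc). A quick sanity check, assuming that aggregation rule:

```python
# Check that the aggregate "acc" equals the unweighted mean of the per-task
# accuracies. This assumes the leaderboard averages acc over the tasks that
# report it (gsm8k and winogrande; drop reports em/f1 instead).
gsm8k_acc = 0.07354056103108415
winogrande_acc = 0.7174427782162589
aggregate_acc = 0.39549166962367155  # "all" -> "acc" above

mean_acc = (gsm8k_acc + winogrande_acc) / 2
assert abs(mean_acc - aggregate_acc) < 1e-12
```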
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
yukiamenta/tubaina | ---
license: openrail
---
|
Non-Residual-Prompting/C2Gen | ---
language:
- en
license:
- cc-by-sa-4.0
size_categories:
- <100K
task_categories:
- text-generation
---
# Dataset Card for Contextualized CommonGen (C2Gen)
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
  - [Data Fields](#data-fields)
  - [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
  - [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Licensing Information](#licensing-information)
## Dataset Description
- **Repository:** [Non-Residual Prompting](https://github.com/FreddeFrallan/Non-Residual-Prompting)
- **Paper:** [Fine-Grained Controllable Text Generation Using Non-Residual Prompting](https://aclanthology.org/2022.acl-long.471)
- **Point of Contact:** [Fredrik Carlsson](mailto:Fredrik.Carlsson@ri.se)
### Dataset Summary
CommonGen [Lin et al., 2020](https://arxiv.org/abs/1911.03705) is a dataset for the constrained text generation task of word inclusion, but the task does not allow context to be included. Therefore, to complement CommonGen, we provide an extended test set, C2Gen [Carlsson et al., 2022](https://aclanthology.org/2022.acl-long.471), where an additional context is provided for each set of target words. The task is therefore reformulated to both generate commonsensical text that includes the given words and have the generated text adhere to the given context.
### Languages
English
## Dataset Structure
### Data Instances
{"Context": "The show came on the television with people singing. The family all gathered to watch. They all became silent when the show came on.", "Words": ["follow", "series", "voice"]}
### Data Fields
- context: the generated text by the model should adhere to this text
- words: the words that should be included in the generated continuation
### Data Splits
Test
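The task described above, generating a continuation of the context that includes all target words, can be checked with a minimal word-inclusion test. A sketch of that check (the continuation string is hypothetical; real evaluation would also need lemmatization to credit inflected forms):

```python
import re

def includes_all_words(text: str, words: list[str]) -> bool:
    """Naive check that every target word appears as a whole word in text.

    CommonGen-style scoring typically also counts inflected forms
    (e.g. 'followed' for 'follow'); this exact-match sketch ignores that.
    """
    tokens = set(re.findall(r"[a-z']+", text.lower()))
    return all(w.lower() in tokens for w in words)

# Hypothetical continuation of the example instance above.
continuation = "They would follow the series closely, hanging on every voice."
assert includes_all_words(continuation, ["follow", "series", "voice"])
```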
## Dataset Creation
### Curation Rationale
C2Gen was created because the authors of the paper believed that the task formulation of CommonGen is too narrow, and that it needlessly incentivizes researchers
to focus on methods that do not support context, which is orthogonal to their belief that many application areas necessitate the consideration of surrounding context. Therefore, to complement CommonGen, they provide an extended test set where an additional context is provided for each set of target words.
### Initial Data Collection and Normalization
The dataset was constructed with the help of the crowdsourcing platform Mechanical Turk. Each remaining concept set manually received a textual context. To ensure the quality of the data generation, only native English speakers with a recorded high acceptance rate were allowed to participate. Finally, all contexts were manually verified and fixed in terms of typos and poor quality. Furthermore, we want to raise awareness that C2Gen can contain personal data or offensive content. If you encounter such a sample, please reach out to us.
## Licensing Information
license: cc-by-sa-4.0
|
AlekseyKorshuk/guanaco-english-chatml | ---
dataset_info:
features:
- name: conversation
list:
- name: content
dtype: string
- name: do_train
dtype: bool
- name: role
dtype: string
splits:
- name: train
num_bytes: 126814007
num_examples: 216541
download_size: 66350200
dataset_size: 126814007
---
# Dataset Card for "guanaco-english-chatml"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
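Given the schema above (a `conversation` list of turns with `content`, `do_train`, and `role` fields), rendering a record into ChatML-style text might look like the following sketch. The `<|im_start|>`/`<|im_end|>` delimiters and the sample turns are assumptions based on the dataset's name, not taken from the card:

```python
# Render a conversation record into ChatML-style text. The delimiter tokens
# and the example turns are assumptions (the card itself documents only the
# per-turn schema: content / do_train / role).
def to_chatml(conversation):
    parts = []
    for turn in conversation:
        parts.append(f"<|im_start|>{turn['role']}\n{turn['content']}<|im_end|>")
    return "\n".join(parts)

example = [
    {"role": "user", "content": "Hello!", "do_train": False},
    {"role": "assistant", "content": "Hi, how can I help?", "do_train": True},
]
print(to_chatml(example))
```

The `do_train` flag is plausibly a per-turn loss mask (train only on assistant turns), which is why it is carried alongside `role` and `content` rather than at the record level.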
Chandrasrishti/pdf_chatbot_book4 | ---
license: apache-2.0
---
|
suanlixianren/sovits3.0_32k_mirror | ---
license: mit
---
|
open-llm-leaderboard/details_automerger__Experiment27Neuralsirkrishna-7B | ---
pretty_name: Evaluation run of automerger/Experiment27Neuralsirkrishna-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [automerger/Experiment27Neuralsirkrishna-7B](https://huggingface.co/automerger/Experiment27Neuralsirkrishna-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_automerger__Experiment27Neuralsirkrishna-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-11T19:51:50.411484](https://huggingface.co/datasets/open-llm-leaderboard/details_automerger__Experiment27Neuralsirkrishna-7B/blob/main/results_2024-03-11T19-51-50.411484.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6525135225054541,\n\
\ \"acc_stderr\": 0.03207449149012873,\n \"acc_norm\": 0.6518414876767434,\n\
\ \"acc_norm_stderr\": 0.03274696054605214,\n \"mc1\": 0.616891064871481,\n\
\ \"mc1_stderr\": 0.017018461679389855,\n \"mc2\": 0.7739528869601758,\n\
\ \"mc2_stderr\": 0.013774958307913162\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7107508532423208,\n \"acc_stderr\": 0.013250012579393441,\n\
\ \"acc_norm\": 0.7320819112627986,\n \"acc_norm_stderr\": 0.012942030195136438\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7142003584943238,\n\
\ \"acc_stderr\": 0.004508710891053863,\n \"acc_norm\": 0.8903604859589723,\n\
\ \"acc_norm_stderr\": 0.003118013608669293\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723292,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723292\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683512,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683512\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.03983798306659807,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.03983798306659807\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159464,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159464\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n\
\ \"acc_stderr\": 0.013586619219903341,\n \"acc_norm\": 0.8250319284802043,\n\
\ \"acc_norm_stderr\": 0.013586619219903341\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4324022346368715,\n\
\ \"acc_stderr\": 0.01656897123354861,\n \"acc_norm\": 0.4324022346368715,\n\
\ \"acc_norm_stderr\": 0.01656897123354861\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600713,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600713\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4706649282920469,\n\
\ \"acc_stderr\": 0.012748238397365549,\n \"acc_norm\": 0.4706649282920469,\n\
\ \"acc_norm_stderr\": 0.012748238397365549\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462923,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462923\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.616891064871481,\n\
\ \"mc1_stderr\": 0.017018461679389855,\n \"mc2\": 0.7739528869601758,\n\
\ \"mc2_stderr\": 0.013774958307913162\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8484609313338595,\n \"acc_stderr\": 0.010077698907571778\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6937073540561031,\n \
\ \"acc_stderr\": 0.01269693010656291\n }\n}\n```"
repo_url: https://huggingface.co/automerger/Experiment27Neuralsirkrishna-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|arc:challenge|25_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|gsm8k|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hellaswag|10_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T19-51-50.411484.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T19-51-50.411484.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- '**/details_harness|winogrande|5_2024-03-11T19-51-50.411484.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-11T19-51-50.411484.parquet'
- config_name: results
data_files:
- split: 2024_03_11T19_51_50.411484
path:
- results_2024-03-11T19-51-50.411484.parquet
- split: latest
path:
- results_2024-03-11T19-51-50.411484.parquet
---
# Dataset Card for Evaluation run of automerger/Experiment27Neuralsirkrishna-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [automerger/Experiment27Neuralsirkrishna-7B](https://huggingface.co/automerger/Experiment27Neuralsirkrishna-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_automerger__Experiment27Neuralsirkrishna-7B",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-11T19:51:50.411484](https://huggingface.co/datasets/open-llm-leaderboard/details_automerger__Experiment27Neuralsirkrishna-7B/blob/main/results_2024-03-11T19-51-50.411484.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6525135225054541,
"acc_stderr": 0.03207449149012873,
"acc_norm": 0.6518414876767434,
"acc_norm_stderr": 0.03274696054605214,
"mc1": 0.616891064871481,
"mc1_stderr": 0.017018461679389855,
"mc2": 0.7739528869601758,
"mc2_stderr": 0.013774958307913162
},
"harness|arc:challenge|25": {
"acc": 0.7107508532423208,
"acc_stderr": 0.013250012579393441,
"acc_norm": 0.7320819112627986,
"acc_norm_stderr": 0.012942030195136438
},
"harness|hellaswag|10": {
"acc": 0.7142003584943238,
"acc_stderr": 0.004508710891053863,
"acc_norm": 0.8903604859589723,
"acc_norm_stderr": 0.003118013608669293
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723292,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723292
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683512,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683512
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.03983798306659807,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.03983798306659807
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455334,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455334
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159464,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159464
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903341,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903341
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4324022346368715,
"acc_stderr": 0.01656897123354861,
"acc_norm": 0.4324022346368715,
"acc_norm_stderr": 0.01656897123354861
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600713,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600713
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4706649282920469,
"acc_stderr": 0.012748238397365549,
"acc_norm": 0.4706649282920469,
"acc_norm_stderr": 0.012748238397365549
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462923,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462923
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.616891064871481,
"mc1_stderr": 0.017018461679389855,
"mc2": 0.7739528869601758,
"mc2_stderr": 0.013774958307913162
},
"harness|winogrande|5": {
"acc": 0.8484609313338595,
"acc_stderr": 0.010077698907571778
},
"harness|gsm8k|5": {
"acc": 0.6937073540561031,
"acc_stderr": 0.01269693010656291
}
}
```
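As a rough illustration of how the `"all"` aggregate relates to the per-task numbers, the per-task accuracies can be macro-averaged. This is a sketch only, using four of the task scores copied from the results block above; the leaderboard's own aggregation covers all tasks and may differ in detail:

```python
# Illustrative macro-average over a handful of the per-task accuracies above.
# The Open LLM Leaderboard's actual aggregation spans all tasks.
task_acc = {
    "harness|arc:challenge|25": 0.7107508532423208,
    "harness|hellaswag|10": 0.7142003584943238,
    "harness|winogrande|5": 0.8484609313338595,
    "harness|gsm8k|5": 0.6937073540561031,
}

macro_avg = sum(task_acc.values()) / len(task_acc)
print(f"macro-average acc over {len(task_acc)} tasks: {macro_avg:.4f}")
```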
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
creative-graphic-design/PubLayNet | ---
annotations_creators:
- machine-generated
language_creators:
- found
language:
- en
license:
- cdla-permissive-1.0
multilinguality:
- monolingual
size_categories: []
source_datasets:
- original
task_categories:
- image-classification
- image-segmentation
- image-to-text
- question-answering
- other
- multiple-choice
- token-classification
- tabular-to-text
- object-detection
- table-question-answering
- text-classification
- table-to-text
task_ids:
- multi-label-image-classification
- multi-class-image-classification
- semantic-segmentation
- image-captioning
- extractive-qa
- closed-domain-qa
- multiple-choice-qa
- named-entity-recognition
pretty_name: PubLayNet
tags:
- graphic design
- layout-generation
dataset_info:
features:
- name: image_id
dtype: int32
- name: file_name
dtype: string
- name: width
dtype: int32
- name: height
dtype: int32
- name: image
dtype: image
- name: annotations
sequence:
- name: annotation_id
dtype: int32
- name: area
dtype: float32
- name: bbox
sequence: float32
length: 4
- name: category
struct:
- name: category_id
dtype: int32
- name: name
dtype:
class_label:
names:
'0': text
'1': title
'2': list
'3': table
'4': figure
- name: supercategory
dtype: string
- name: category_id
dtype: int32
- name: image_id
dtype: int32
- name: iscrowd
dtype: bool
- name: segmentation
dtype: image
splits:
- name: train
num_bytes: 99127922734.771
num_examples: 335703
- name: validation
num_bytes: 3513203604.885
num_examples: 11245
- name: test
num_bytes: 3406081626.495
num_examples: 11405
download_size: 107597638930
dataset_size: 106047207966.15099
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
# Dataset Card for PubLayNet
[](https://github.com/shunk031/huggingface-datasets_PubLayNet/actions/workflows/ci.yaml)
## Table of Contents
- [Dataset Card Creation Guide](#dataset-card-creation-guide)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://developer.ibm.com/exchanges/data/all/publaynet/
- **Repository:** https://github.com/shunk031/huggingface-datasets_PubLayNet
- **Paper (Preprint):** https://arxiv.org/abs/1908.07836
- **Paper (ICDAR2019):** https://ieeexplore.ieee.org/document/8977963
### Dataset Summary
PubLayNet is a dataset for document layout analysis. It contains images of research papers and articles, together with annotations for various elements on a page, such as "text", "list", and "figure". The dataset was obtained by automatically matching the XML representations and the content of over 1 million PDF articles that are publicly available on PubMed Central.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
```python
import datasets as ds
dataset = ds.load_dataset(
path="shunk031/PubLayNet",
decode_rle=True, # True if Run-length Encoding (RLE) is to be decoded and converted to binary mask.
)
```
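Once loaded, each example carries COCO-style annotation records following the `annotations` feature declared in the metadata above. The record below is hypothetical, but the label ids match the `category` class labels defined for this dataset (a minimal sketch, not part of the official loader):

```python
# Map the category_id values declared in the dataset features to label names.
CATEGORY_NAMES = {0: "text", 1: "title", 2: "list", 3: "table", 4: "figure"}

# Hypothetical annotation record in the dataset's COCO-style layout.
annotation = {
    "annotation_id": 1,
    "category": {"category_id": 3},          # 3 -> "table"
    "bbox": [100.0, 200.0, 300.0, 150.0],    # [x, y, width, height]
}

x, y, w, h = annotation["bbox"]
area = w * h
label = CATEGORY_NAMES[annotation["category"]["category_id"]]
print(f"{label}: bbox area = {area}")
```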
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
[More Information Needed]
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
- [CDLA-Permissive](https://cdla.io/permissive-1-0/)
### Citation Information
```bibtex
@inproceedings{zhong2019publaynet,
title={Publaynet: largest dataset ever for document layout analysis},
author={Zhong, Xu and Tang, Jianbin and Yepes, Antonio Jimeno},
booktitle={2019 International Conference on Document Analysis and Recognition (ICDAR)},
pages={1015--1022},
year={2019},
organization={IEEE}
}
```
### Contributions
Thanks to [ibm-aur-nlp/PubLayNet](https://github.com/ibm-aur-nlp/PubLayNet) for creating this dataset.
|
euswam/SuamPk | ---
license: cc-by-3.0
---
|
David-Xu/astronomy-stack-cira | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: score_question
dtype: string
- name: score_chosen
dtype: string
- name: score_rejected
dtype: string
splits:
- name: train
num_bytes: 62648084
num_examples: 19935
download_size: 15411984
dataset_size: 62648084
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CVasNLPExperiments/TinyImagenet_2k_validation_google_flan_t5_xl_mode_T_SPECIFIC_A_ns_2000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0__Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_clip_tags_LAION_ViT_H_14_2B_simple_specific_rices
num_bytes: 839095
num_examples: 2000
download_size: 216830
dataset_size: 839095
---
# Dataset Card for "TinyImagenet_2k_validation_google_flan_t5_xl_mode_T_SPECIFIC_A_ns_2000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
one-sec-cv12/chunk_75 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 23301532944.625
num_examples: 242603
download_size: 21413215455
dataset_size: 23301532944.625
---
# Dataset Card for "chunk_75"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
huggingartists/florence-the-machine | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/florence-the-machine"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.269066 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/99d09eb55276442d715ac14f06173a4e.1000x1000x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/florence-the-machine">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Florence + The Machine</div>
<a href="https://genius.com/artists/florence-the-machine">
<div style="text-align: center; font-size: 14px;">@florence-the-machine</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/florence-the-machine).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/florence-the-machine")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|173| -| -|
'Train' can easily be divided into 'train', 'validation' and 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np

datasets = load_dataset("huggingartists/florence-the-machine")

train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03

texts = datasets["train"]["text"]
# np.split cuts the array at the given indices, yielding three consecutive chunks
train, validation, test = np.split(
    texts,
    [
        int(len(texts) * train_percentage),
        int(len(texts) * (train_percentage + validation_percentage)),
    ],
)

datasets = DatasetDict(
    {
        "train": Dataset.from_dict({"text": list(train)}),
        "validation": Dataset.from_dict({"text": list(validation)}),
        "test": Dataset.from_dict({"text": list(test)}),
    }
)
```
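The split logic above can be sanity-checked on toy data. This sketch applies the same 90/7/3 percentage cuts to 100 dummy lines instead of the real lyrics:

```python
import numpy as np

# Toy check of the 90/7/3 split logic on 100 dummy lines instead of real lyrics.
texts = [f"line {i}" for i in range(100)]

cut1 = int(len(texts) * 0.9)           # end of the train chunk
cut2 = int(len(texts) * (0.9 + 0.07))  # end of the validation chunk
train, validation, test = np.split(np.array(texts), [cut1, cut2])

print(len(train), len(validation), len(test))  # 90 7 3
```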
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
    author={Aleksey Korshuk},
    year={2021}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
ashleybishop/tomi_nil_processed | ---
dataset_info:
features:
- name: label
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 2261110
num_examples: 5994
- name: validation
num_bytes: 2264924
num_examples: 5994
- name: test
num_bytes: 2255563
num_examples: 5994
download_size: 818461
dataset_size: 6781597
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
shortbread/tickers | ---
language:
- en
tags:
- finance
size_categories:
- 1K<n<10K
last_updated: 2023-07-20
---
Tickers
=======
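
Cards like this one pair a `---`-delimited YAML front-matter block (language, tags, size category) with a Markdown body. A minimal, stdlib-only sketch of separating the two; the `split_front_matter` helper is hypothetical, not part of the `datasets` or `huggingface_hub` APIs, and assumes the body itself contains no `---` lines:

```python
CARD = """---
language:
- en
tags:
- finance
size_categories:
- 1K<n<10K
---
Tickers
=======
"""

def split_front_matter(card: str):
    """Return (yaml_front_matter, markdown_body) for a '---'-delimited card."""
    parts = card.split("---\n")
    # parts[0] is the empty string before the opening '---',
    # parts[1] is the YAML block, the rest is the Markdown body.
    front = parts[1].strip()
    body = "---\n".join(parts[2:]).strip()
    return front, body

front, body = split_front_matter(CARD)
print(body.splitlines()[0])  # prints "Tickers"
```

Feeding `front` to a YAML parser (e.g. PyYAML's `yaml.safe_load`) would then yield the tag metadata as a plain dict.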
|
hson04/testData | ---
dataset_info:
features:
- name: id_EXIST
dtype: int64
- name: lang
dtype: string
- name: text
dtype: string
- name: number_annotators
dtype: int64
- name: annotators
sequence: string
- name: gender_annotators
sequence: string
- name: age_annotators
sequence: string
- name: ethnicities_annotators
sequence: string
- name: study_levels_annotators
sequence: string
- name: countries_annotators
sequence: string
- name: labels_task1
sequence: string
- name: labels_task2
sequence: string
- name: labels_task3
sequence:
sequence: string
- name: split
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 7121632
num_examples: 6920
download_size: 1175271
dataset_size: 7121632
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
rikdas/madras_dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 22751754.0
num_examples: 10
download_size: 22753302
dataset_size: 22751754.0
---
# Dataset Card for "madras_dataset"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
javismiles/lora3 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 23169.0
num_examples: 3
download_size: 23782
dataset_size: 23169.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
pane2k/paneModel | ---
license: mit
---
|
Deivid457/Rengoku | ---
license: openrail
---
|
alphalm/gt1_8kElo_all_tokenized | ---
license: apache-2.0
---
|
indicbench/arc_hi | ---
dataset_info:
- config_name: ARC-Challenge
features:
- name: answerKey
dtype: string
- name: choices
struct:
- name: label
sequence: string
- name: text
sequence: string
- name: id
dtype: string
- name: question
dtype: string
splits:
- name: validation
num_bytes: 215532
num_examples: 299
- name: test
num_bytes: 839210
num_examples: 1172
download_size: 396941
dataset_size: 1054742
- config_name: default
features:
- name: _data_files
list:
- name: filename
dtype: string
- name: _fingerprint
dtype: string
- name: _format_columns
dtype: 'null'
- name: _format_type
dtype: 'null'
- name: _output_all_columns
dtype: bool
- name: _split
dtype: 'null'
splits:
- name: validation
num_bytes: 54
num_examples: 1
- name: test
num_bytes: 54
num_examples: 1
download_size: 6510
dataset_size: 108
configs:
- config_name: ARC-Challenge
data_files:
- split: validation
path: ARC-Challenge/validation-*
- split: test
path: ARC-Challenge/test-*
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
benayas/snips_llm_v2 | ---
dataset_info:
features:
- name: text
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 7164970
num_examples: 13084
- name: test
num_bytes: 768070
num_examples: 1400
download_size: 900698
dataset_size: 7933040
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
arieg/cluster_cls | ---
dataset_info:
features:
- name: image
dtype: image
- name: track_id
dtype:
class_label:
names:
'0': '000002'
'1': '000005'
'2': '000010'
'3': '000140'
'4': '000141'
'5': 000148
'6': 000182
'7': 000190
'8': 000193
'9': 000194
'10': 000197
'11': '000200'
'12': '000203'
'13': '000204'
'14': '000207'
'15': '000210'
'16': '000211'
'17': '000212'
'18': '000213'
'19': '000255'
'20': '000256'
'21': 000368
'22': '000424'
'23': 000459
'24': '000534'
'25': '000540'
'26': '000546'
'27': '000574'
'28': '000602'
'29': '000615'
'30': '000620'
'31': '000621'
'32': '000625'
'33': '000666'
'34': '000667'
'35': '000676'
'36': 000690
'37': 000694
'38': 000695
'39': '000704'
'40': '000705'
'41': '000706'
'42': '000707'
'43': 000708
'44': 000709
'45': '000714'
'46': '000715'
'47': '000716'
'48': 000718
'49': '000777'
'50': 000814
'51': 000821
'52': 000822
'53': 000825
'54': 000853
'55': 000890
'56': 000892
'57': 000897
'58': 000993
'59': 000995
'60': 000997
'61': 000998
'62': 001039
'63': '001040'
'64': '001066'
'65': 001069
'66': '001073'
'67': '001075'
'68': 001082
'69': 001083
'70': 001087
'71': '001102'
'72': 001193
'73': 001195
'74': 001196
'75': 001197
'76': 001249
'77': 001259
'78': '001270'
'79': '001276'
'80': '001277'
'81': 001278
'82': '001417'
'83': '001427'
'84': '001443'
'85': 001482
'86': '001510'
'87': '001544'
'88': '001642'
'89': '001644'
'90': 001649
'91': '001661'
'92': '001663'
'93': '001666'
'94': '001673'
'95': 001680
'96': 001681
'97': 001682
'98': 001683
'99': 001684
'100': 001685
'101': 001686
'102': 001687
'103': 001688
'104': 001689
'105': '001701'
'106': '001702'
'107': '001703'
'108': '001704'
'109': '001706'
'110': '001720'
'111': '001732'
'112': '001733'
'113': '001735'
'114': '001736'
'115': 001883
'116': 001891
'117': 001893
'118': 001924
'119': 001925
'120': 001929
'121': 001930
'122': '002012'
'123': 002096
'124': 002097
'125': 002099
'126': '003263'
'127': '003264'
'128': '003265'
'129': '003266'
'130': '003270'
'131': '003271'
'132': '003272'
'133': '003273'
'134': '003274'
'135': 003492
'136': '003532'
'137': '003533'
'138': '003534'
'139': '003535'
'140': '003537'
'141': 003538
'142': '003573'
'143': 003598
'144': '003624'
'145': '003707'
'146': 003708
'147': '003720'
'148': '003721'
'149': '003722'
'150': '003724'
'151': '003725'
'152': '003761'
'153': '003762'
'154': '003763'
'155': '003765'
'156': '003766'
'157': '003775'
'158': '003776'
'159': '003777'
'160': 003778
'161': 003779
'162': 003832
'163': 003833
'164': 003840
'165': 003880
'166': 003895
'167': 003896
'168': 003904
'169': 003905
'170': 003906
'171': 003908
'172': 003909
'173': 003910
'174': 003911
'175': 003912
'176': 003913
'177': 003920
'178': 003921
'179': 003950
'180': '004013'
'181': '004017'
'182': '004022'
'183': '004037'
'184': '004066'
'185': '004067'
'186': 004068
'187': 004069
'188': '004070'
'189': '004071'
'190': '004072'
'191': '004073'
'192': '004074'
'193': '004075'
'194': '004076'
'195': '004077'
'196': 004078
'197': 004079
'198': 004080
'199': 004091
'200': 004092
'201': 004093
'202': 004094
'203': 004095
'204': 004096
'205': 004097
'206': 004098
'207': 004099
'208': '004100'
'209': '004101'
'210': '004102'
'211': '004103'
'212': 004108
'213': '004232'
'214': '004233'
'215': '004234'
'216': '004235'
'217': '004236'
'218': 004239
'219': '004450'
'220': '004507'
'221': 004508
'222': 004509
'223': '004510'
'224': '004511'
'225': 004519
'226': '004520'
'227': '004521'
'228': '004522'
'229': 004682
'230': 004684
'231': 004685
'232': 004688
'233': '004777'
'234': 004778
'235': 004779
'236': 004780
'237': 004781
'238': 004782
'239': 004784
'240': 004785
'241': 004786
'242': 004787
'243': 004788
'244': 004799
'245': 004835
'246': 004836
'247': 004838
'248': 004846
'249': 004848
'250': 004849
'251': '005006'
'252': '005156'
'253': '005157'
'254': 005158
'255': 005159
'256': 005169
'257': '005170'
'258': '005171'
'259': 005191
'260': '005264'
'261': 005268
'262': '005376'
'263': 005381
'264': '005521'
'265': 005879
'266': 005936
'267': 005940
'268': 006329
'269': '006330'
'270': '006331'
'271': '006332'
'272': '006333'
'273': '006342'
'274': '006354'
'275': '006357'
'276': 006358
'277': '006360'
'278': '006363'
'279': '006366'
'280': '006367'
'281': 006368
'282': '006370'
'283': '006372'
'284': '006373'
'285': '006376'
'286': 006379
'287': 006380
'288': 006381
'289': 006382
'290': 006383
'291': 006385
'292': 006387
'293': 006389
'294': 006390
'295': 006393
'296': 006394
'297': 006396
'298': '006406'
'299': '006407'
'300': 006439
'301': '006440'
'302': '006442'
'303': '006443'
'304': 006448
'305': 006459
'306': '006461'
'307': '006463'
'308': '006467'
'309': 006469
'310': '006517'
'311': 006519
'312': '006603'
'313': '006605'
'314': '006606'
'315': '006607'
'316': 006608
'317': 006609
'318': '006610'
'319': '006611'
'320': '006674'
'321': '006675'
'322': '006677'
'323': 006679
'324': 006680
'325': 006684
'326': '006762'
'327': '006776'
'328': 006778
'329': 006779
'330': 006782
'331': 006783
'332': 006788
'333': 006802
'334': 006803
'335': 006854
'336': 006855
'337': 006856
'338': 006857
'339': '007011'
'340': '007373'
'341': '007374'
'342': '007375'
'343': '007376'
'344': '007377'
'345': 007378
'346': 007379
'347': 007381
'348': 007383
'349': 007385
'350': 007386
'351': 007388
'352': 007391
'353': 007393
'354': 007481
'355': 007482
'356': 007483
'357': 007487
'358': 007488
'359': 007489
'360': 007490
'361': 007491
'362': 007492
'363': 007495
'364': '007526'
'365': '007527'
'366': 007528
'367': 007529
'368': 007548
'369': '007554'
'370': 007709
'371': '007710'
'372': '007711'
'373': '007712'
'374': '007713'
'375': 007872
'376': 008056
'377': 008208
'378': 008256
'379': 008259
'380': 008261
'381': 008345
'382': 008357
'383': 008363
'384': 008372
'385': 008416
'386': 009152
'387': 009155
'388': 009307
'389': 009476
'390': 009477
'391': 009491
'392': 009505
'393': 009511
'394': 009512
'395': 009513
'396': 009550
'397': 009553
'398': 009555
'399': 009557
'400': 009559
'401': 009560
'402': 009678
'403': 009721
'404': 009846
'405': 009887
'406': 009888
'407': 009918
'408': 009962
'409': 010186
'410': 010192
'411': '010250'
'412': '010374'
'413': '010375'
'414': '010376'
'415': '010377'
'416': 010381
'417': 010382
'418': 010383
'419': 010384
'420': 010385
'421': 010386
'422': 010387
'423': 010388
'424': 010389
'425': '010435'
'426': 010438
'427': 010439
'428': '010440'
'429': '010441'
'430': '010442'
'431': '010443'
'432': '010444'
'433': '010447'
'434': 010458
'435': 010480
'436': 010481
'437': 010485
'438': '010521'
'439': '010527'
'440': '010535'
'441': '010541'
'442': '010575'
'443': '010577'
'444': 010668
'445': 010669
'446': '010670'
'447': '010671'
'448': '010672'
'449': '010673'
'450': '010674'
'451': '010675'
'452': '010676'
'453': '010677'
'454': 010678
'455': 010679
'456': 010682
'457': 010684
'458': 010693
'459': 010694
'460': 010695
'461': 010696
'462': 010697
'463': 010698
'464': 010699
'465': 010805
'466': 010806
'467': 010807
'468': 010808
'469': 010809
'470': 010810
'471': 010983
'472': 010992
'473': 010993
'474': 011019
'475': '011020'
'476': 011059
'477': 011198
'478': 011199
'479': '011200'
'480': '011204'
'481': '011206'
'482': '011234'
'483': '011237'
'484': 011239
'485': '011242'
'486': '011261'
'487': '011262'
'488': '011264'
'489': 011268
'490': 011298
'491': 011299
'492': '011306'
'493': '011333'
'494': '011334'
'495': '011503'
'496': '011504'
'497': '011505'
'498': 011508
'499': '011544'
'500': 011638
'501': '011671'
'502': '011672'
'503': '011673'
'504': '011674'
'505': '011675'
'506': '011677'
'507': 011679
'508': 011681
'509': 011682
'510': 011683
'511': '011763'
'512': '011764'
'513': '011765'
'514': '011766'
'515': '011767'
'516': 011768
'517': 011769
'518': '011770'
'519': '011771'
'520': '011772'
'521': '011773'
'522': '011774'
'523': '011775'
'524': '011776'
'525': '011777'
'526': 011778
'527': 011779
'528': 011780
'529': 011781
'530': 011782
'531': 011783
'532': 011784
'533': 011785
'534': 011786
'535': 011787
'536': 011788
'537': 011789
'538': 011790
'539': 011791
'540': 011792
'541': 011793
'542': 011794
'543': 011795
'544': 011803
'545': 011818
'546': 011839
'547': 011861
'548': 011862
'549': 011867
'550': 011868
'551': 011916
'552': 011917
'553': 011918
'554': 011919
'555': 011920
'556': 011921
'557': 011922
'558': 011933
'559': 011937
'560': 011942
'561': 011946
'562': 011947
'563': 011951
'564': '012045'
'565': '012046'
'566': '012047'
'567': 012048
'568': 012049
'569': '012050'
'570': '012051'
'571': '012052'
'572': '012053'
'573': 012058
'574': 012059
'575': '012060'
'576': '012061'
'577': '012062'
'578': '012064'
'579': '012065'
'580': '012066'
'581': '012067'
'582': 012109
'583': '012146'
'584': '012147'
'585': '012173'
'586': '012174'
'587': 012179
'588': 012188
'589': 012189
'590': '012346'
'591': 012348
'592': 012349
'593': '012350'
'594': '012351'
'595': '012352'
'596': '012353'
'597': '012355'
'598': '012376'
'599': 012387
'600': 012390
'601': 012394
'602': 012481
'603': 012482
'604': 012484
'605': 012485
'606': 012486
'607': 012487
'608': 012488
'609': 012489
'610': 012490
'611': 012508
'612': '012513'
'613': '012514'
'614': 012518
'615': '012521'
'616': '012526'
'617': '012527'
'618': '012530'
'619': '012531'
'620': '012532'
'621': '012537'
'622': '012551'
'623': '012552'
'624': '012654'
'625': 012690
'626': 012691
'627': 012692
'628': '012737'
'629': 012985
'630': 012986
'631': 013191
'632': 013197
'633': 013199
'634': '013201'
'635': 013218
'636': '013220'
'637': '013325'
'638': 013328
'639': '013362'
'640': 013378
'641': '013474'
'642': '013537'
'643': 013538
'644': 013539
'645': '013540'
'646': '013556'
'647': '013561'
'648': '013562'
'649': '013566'
'650': '013571'
'651': 013578
'652': 013591
'653': 013596
'654': '013666'
'655': 013668
'656': '013670'
'657': '013706'
'658': '013707'
'659': 013708
'660': 013709
'661': '013710'
'662': '013711'
'663': '013735'
'664': '013747'
'665': 013748
'666': 013749
'667': '013767'
'668': 013768
'669': 013804
'670': 013927
'671': 013928
'672': 013929
'673': 013930
'674': '014063'
'675': 014208
'676': '014315'
'677': '014316'
'678': '014317'
'679': 014318
'680': 014319
'681': '014320'
'682': '014344'
'683': 014358
'684': '014363'
'685': '014365'
'686': 014386
'687': 014391
'688': 014538
'689': 014539
'690': '014541'
'691': '014542'
'692': 014568
'693': 014569
'694': '014570'
'695': '014571'
'696': '014572'
'697': '014576'
'698': '014577'
'699': 014578
'700': 014579
'701': 014580
'702': 014581
'703': 014583
'704': 014584
'705': 014585
'706': 014586
'707': 014588
'708': 014589
'709': 014590
'710': '014601'
'711': '014602'
'712': '014603'
'713': '014604'
'714': '014653'
'715': '014661'
'716': '014663'
'717': 014684
'718': 014690
'719': 014693
'720': '014733'
'721': '014734'
'722': '014735'
'723': '014736'
'724': '014737'
'725': 014738
'726': 014739
'727': '014740'
'728': '014741'
'729': '014742'
'730': '014743'
'731': '014744'
'732': '014745'
'733': 014809
'734': 014869
'735': 015094
'736': '015210'
'737': '015464'
'738': 015469
'739': '015471'
'740': '015475'
'741': '015476'
'742': 015487
'743': 015488
'744': '015540'
'745': '015541'
'746': '015542'
'747': '015543'
'748': '015625'
'749': 015769
'750': '015770'
'751': '015771'
'752': '015772'
'753': '015773'
'754': 015880
'755': 016095
'756': '016155'
'757': 016158
'758': '016162'
'759': '016163'
'760': '016334'
'761': '016337'
'762': 016338
'763': 016339
'764': '016340'
'765': '016354'
'766': '016743'
'767': '016744'
'768': '016745'
'769': '016747'
'770': 016819
'771': 016820
'772': 016821
'773': 016822
'774': 016878
'775': 016879
'776': 016880
'777': 016895
'778': 016994
'779': 016995
'780': 016997
'781': '017132'
'782': '017344'
'783': '017345'
'784': '017462'
'785': 017491
'786': 017496
'787': 017499
'788': '017500'
'789': '017573'
'790': 017588
'791': '017605'
'792': '017606'
'793': '017607'
'794': 017608
'795': 017609
'796': '017610'
'797': '017611'
'798': '017631'
'799': '017632'
'800': '017633'
'801': '017634'
'802': '017635'
'803': '017636'
'804': '017637'
'805': '017644'
'806': '017735'
'807': 017782
'808': 017884
'809': 017906
'810': 018031
'811': 018032
'812': 018033
'813': 018034
'814': 018037
'815': 018038
'816': 018039
'817': 018043
'818': 018044
'819': 018112
'820': 018124
'821': 018144
'822': 018145
'823': 018146
'824': 018159
'825': 018197
'826': 018350
'827': 018607
'828': 018611
'829': 018876
'830': 018877
'831': 018887
'832': 019073
'833': 019074
'834': 019179
'835': 019184
'836': 019187
'837': 019192
'838': 019412
'839': 019413
'840': 019415
'841': 019416
'842': 019417
'843': 019418
'844': 019420
'845': 019422
'846': 019423
'847': 019425
'848': 019438
'849': 019439
'850': 019441
'851': 019442
'852': 019459
'853': 019673
'854': 019674
'855': 019685
'856': 019689
'857': 019707
'858': 019708
'859': 019729
'860': 019758
'861': 019759
'862': 019760
'863': 019889
'864': 019890
'865': 019891
'866': '020050'
'867': 020296
'868': '020361'
'869': '020362'
'870': '020364'
'871': '020365'
'872': '020366'
'873': 020369
'874': '020372'
'875': '020373'
'876': '020374'
'877': '020375'
'878': '020376'
'879': '020424'
'880': '020432'
'881': 020469
'882': '020667'
'883': '020704'
'884': 020818
'885': 021058
'886': 021085
'887': 021087
'888': '021167'
'889': 021228
'890': '021231'
'891': '021232'
'892': '021400'
'893': '021401'
'894': '021402'
'895': '021403'
'896': '021404'
'897': 021409
'898': '021422'
'899': '021565'
'900': 021587
'901': '021657'
'902': '021672'
'903': '021676'
'904': '021677'
'905': '021707'
'906': '021774'
'907': 021842
'908': 021859
'909': 021860
'910': 021891
'911': 021895
'912': 021995
'913': 021996
'914': 021997
'915': 021998
'916': 021999
'917': '022000'
'918': '022001'
'919': 022088
'920': 022091
'921': 022093
'922': 022094
'923': 022095
'924': 022097
'925': '022150'
'926': 022295
'927': 022296
'928': '022315'
'929': 022348
'930': '022472'
'931': '022473'
'932': '022474'
'933': '022475'
'934': '022476'
'935': '022477'
'936': 022478
'937': 022479
'938': 022480
'939': 022481
'940': '023010'
'941': '023013'
'942': '023014'
'943': '023015'
'944': '023016'
'945': '023037'
'946': 023039
'947': '023041'
'948': '023063'
'949': '023155'
'950': '023156'
'951': '023172'
'952': 023329
'953': '023353'
'954': '023355'
'955': '023371'
'956': '023372'
'957': '023505'
'958': 023862
'959': '024216'
'960': '024217'
'961': 024218
'962': '024362'
'963': '024363'
'964': '024364'
'965': '024365'
'966': '024366'
'967': '024367'
'968': 024368
'969': 024369
'970': '024370'
'971': '024371'
'972': 024418
'973': '024420'
'974': '024421'
'975': '024422'
'976': '024423'
'977': '024424'
'978': '024425'
'979': '024426'
'980': '024427'
'981': 024428
'982': 024429
'983': '024430'
'984': '024431'
'985': '024432'
'986': '024512'
'987': '024515'
'988': '024521'
'989': '024524'
'990': 024698
'991': 024699
'992': '024700'
'993': '024701'
'994': '024702'
'995': '024717'
'996': '024720'
'997': 024739
'998': '024741'
'999': '024742'
'1000': '024745'
'1001': '024746'
'1002': '024747'
'1003': 024748
'1004': 024749
'1005': 024842
'1006': 024898
'1007': 024899
'1008': 024901
'1009': 024912
'1010': 024915
'1011': 024917
'1012': 024963
'1013': 024975
'1014': 024983
'1015': 025028
'1016': 025029
'1017': '025030'
'1018': '025031'
'1019': '025032'
'1020': '025033'
'1021': '025055'
'1022': '025063'
'1023': '025066'
'1024': '025104'
'1025': '025124'
'1026': '025215'
'1027': '025216'
'1028': '025227'
'1029': '025232'
'1030': '025233'
'1031': '025234'
'1032': '025235'
'1033': '025324'
'1034': 025378
'1035': '025601'
'1036': '025603'
'1037': '025605'
'1038': '025606'
'1039': 025608
'1040': 025609
'1041': 025668
'1042': 025669
'1043': '025670'
'1044': 025795
'1045': 025796
'1046': 025797
'1047': 025802
'1048': 025804
'1049': '026007'
'1050': 026008
'1051': '026010'
'1052': '026011'
'1053': '026012'
'1054': '026013'
'1055': '026014'
'1056': '026016'
'1057': '026017'
'1058': '026020'
'1059': '026021'
'1060': '026022'
'1061': '026025'
'1062': '026026'
'1063': '026034'
'1064': '026035'
'1065': '026036'
'1066': 026169
'1067': '026174'
'1068': 026298
'1069': '026301'
'1070': '026302'
'1071': '026307'
'1072': '026322'
'1073': '026464'
'1074': '026465'
'1075': '026466'
'1076': 026583
'1077': '026600'
'1078': '026605'
'1079': 026629
'1080': 026638
'1081': 026639
'1082': '026640'
'1083': '026641'
'1084': '026642'
'1085': '026643'
'1086': '026651'
'1087': '026652'
'1088': '026653'
'1089': '026654'
'1090': '026655'
'1091': '026656'
'1092': '026657'
'1093': 026658
'1094': 026659
'1095': '026674'
'1096': 026681
'1097': '026754'
'1098': '026765'
'1099': 026859
'1100': 026861
'1101': 026902
'1102': 026904
'1103': 026905
'1104': 026906
'1105': '027164'
'1106': '027177'
'1107': 027194
'1108': 027195
'1109': 027197
'1110': 027198
'1111': 027258
'1112': '027406'
'1113': '027454'
'1114': '027455'
'1115': '027456'
'1116': '027547'
'1117': 027548
'1118': 027549
'1119': '027550'
'1120': '027551'
'1121': '027552'
'1122': 027609
'1123': '027610'
'1124': '027611'
'1125': '027612'
'1126': '027613'
'1127': '027667'
'1128': '027673'
'1129': 027797
'1130': 027798
'1131': 027799
'1132': 027802
'1133': 027803
'1134': 027804
'1135': 027805
'1136': 027855
'1137': 027856
'1138': 027866
'1139': 027945
'1140': 027953
'1141': 027975
'1142': 027978
'1143': 027981
'1144': 027987
'1145': 028070
'1146': 028072
'1147': 028179
'1148': 028241
'1149': 028260
'1150': 028266
'1151': 028274
'1152': 028375
'1153': 028376
'1154': 028477
'1155': 028478
'1156': 028479
'1157': 028480
'1158': 028481
'1159': 028482
'1160': 028483
'1161': 028484
'1162': 028485
'1163': 028546
'1164': 028548
'1165': 028553
'1166': 028571
'1167': 028608
'1168': 028692
'1169': 028802
'1170': 029037
'1171': 029039
'1172': 029040
'1173': 029041
'1174': 029042
'1175': 029043
'1176': 029044
'1177': 029045
'1178': 029128
'1179': 029180
'1180': 029243
'1181': 029245
'1182': 029255
'1183': 029271
'1184': 029272
'1185': 029350
'1186': 029351
'1187': 029355
'1188': 029465
'1189': 029480
'1190': 029526
'1191': 029528
'1192': 029530
'1193': 029587
'1194': 029602
'1195': 029673
'1196': 029718
'1197': 029719
'1198': 029720
'1199': 029721
'1200': 029738
'1201': 029739
'1202': 029740
'1203': 029741
'1204': 029742
'1205': 029744
'1206': 029745
'1207': 029746
'1208': 029747
'1209': 029750
'1210': 029752
'1211': 029807
'1212': 029813
'1213': 029816
'1214': 029961
'1215': 029971
'1216': '030041'
'1217': '030043'
'1218': '030050'
'1219': '030056'
'1220': 030058
'1221': 030059
'1222': 030090
'1223': 030095
'1224': '030120'
'1225': 030196
'1226': 030198
'1227': '030230'
'1228': '030316'
'1229': 030486
'1230': 030487
'1231': 030488
'1232': 030519
'1233': '030520'
'1234': '030521'
'1235': '030522'
'1236': '030636'
'1237': 030682
'1238': 030690
'1239': '030702'
'1240': '030740'
'1241': 030895
'1242': '031040'
'1243': '031041'
'1244': '031042'
'1245': '031043'
'1246': '031044'
'1247': '031165'
'1248': '031356'
'1249': 031389
'1250': 031390
'1251': 031391
'1252': 031392
'1253': 031568
'1254': 031807
'1255': 031887
'1256': 031888
'1257': 031889
'1258': 031999
'1259': '032001'
'1260': '032021'
'1261': '032075'
'1262': 032081
'1263': 032218
'1264': '032325'
'1265': '032326'
'1266': '032327'
'1267': 032328
'1268': 032329
'1269': '032330'
'1270': '032331'
'1271': '032332'
'1272': '032333'
'1273': '032334'
'1274': '032335'
'1275': '032336'
'1276': '032337'
'1277': 032338
'1278': 032339
'1279': '032340'
'1280': '032433'
'1281': '032435'
'1282': '032437'
'1283': 032438
'1284': 032439
'1285': '032525'
'1286': 032686
'1287': 032687
'1288': 032689
'1289': 032693
'1290': 032694
'1291': 032695
'1292': '032755'
'1293': '032756'
'1294': 032759
'1295': '032760'
'1296': 032800
'1297': 032882
'1298': '033020'
'1299': 033049
'1300': '033050'
'1301': '033064'
'1302': '033067'
'1303': 033068
'1304': 033069
'1305': '033070'
'1306': '033071'
'1307': '033072'
'1308': '033123'
'1309': '033124'
'1310': '033203'
'1311': '033216'
'1312': '033221'
'1313': 033278
'1314': '033415'
'1315': '033422'
'1316': '033424'
'1317': '033426'
'1318': '033446'
'1319': 033459
'1320': '033460'
'1321': '033461'
'1322': '033465'
'1323': '033477'
'1324': 033486
'1325': 033538
'1326': 033992
'1327': '034003'
'1328': '034147'
'1329': '034167'
'1330': '034257'
'1331': 034258
'1332': '034263'
'1333': 034484
'1334': '034510'
'1335': '034511'
'1336': 034994
'1337': 034996
'1338': '035007'
'1339': 035008
'1340': 035182
'1341': 035184
'1342': 035198
'1343': 035199
'1344': '035204'
'1345': 035296
'1346': 035299
'1347': '035443'
'1348': '035444'
'1349': '035462'
'1350': '035527'
'1351': '035534'
'1352': '035535'
'1353': '035537'
'1354': 035539
'1355': '035541'
'1356': '035543'
'1357': '035544'
'1358': '035545'
'1359': 035549
'1360': '035550'
'1361': 035569
'1362': '035571'
'1363': 035608
'1364': '035734'
'1365': 036096
'1366': 036097
'1367': 036099
'1368': '036143'
'1369': '036144'
'1370': '036145'
'1371': '036146'
'1372': '036147'
'1373': '036245'
'1374': '036257'
'1375': 036258
'1376': '036261'
'1377': '036272'
'1378': '036273'
'1379': '036275'
'1380': '036277'
'1381': '036302'
'1382': '036304'
'1383': '036322'
'1384': '036333'
'1385': '036371'
'1386': 036380
'1387': 036388
'1388': 036428
'1389': '036435'
'1390': 036481
'1391': '036526'
'1392': '036560'
'1393': '036567'
'1394': '036614'
'1395': '036615'
'1396': '036616'
'1397': 036618
'1398': '036643'
'1399': 036659
'1400': 036799
'1401': 036959
'1402': 036961
'1403': 036965
'1404': 036966
'1405': 036983
'1406': 036984
'1407': 036985
'1408': 036986
'1409': 036987
'1410': 036988
'1411': 036990
'1412': 036992
'1413': 036994
'1414': 036997
'1415': 036998
'1416': 036999
'1417': '037041'
'1418': '037111'
'1419': '037113'
'1420': 037119
'1421': '037121'
'1422': '037131'
'1423': '037136'
'1424': '037141'
'1425': '037147'
'1426': '037324'
'1427': '037325'
'1428': 037368
'1429': 037369
'1430': '037416'
'1431': '037417'
'1432': '037423'
'1433': 037538
'1434': 037592
'1435': '037725'
'1436': '037727'
'1437': '037730'
'1438': '037731'
'1439': 037779
'1440': 037781
'1441': 037784
'1442': 037859
'1443': 037911
'1444': 037920
'1445': 038312
'1446': 038321
'1447': 038323
'1448': 038326
'1449': 038351
'1450': 038352
'1451': 038353
'1452': 038354
'1453': 038361
'1454': 038362
'1455': 038363
'1456': 038365
'1457': 038399
'1458': 038435
'1459': 038450
'1460': 038522
'1461': 038557
'1462': 038560
'1463': 038775
'1464': 038776
'1465': 038777
'1466': 038778
'1467': 038779
'1468': 038780
'1469': 038781
'1470': 038782
'1471': 038783
'1472': 038784
'1473': 038785
'1474': 038817
'1475': 038818
'1476': 038819
'1477': 038820
'1478': 038821
'1479': 038822
'1480': 038823
'1481': 038824
'1482': 038825
'1483': 038826
'1484': 038827
'1485': 038828
'1486': 038829
'1487': 038830
'1488': 038833
'1489': 038834
'1490': 038847
'1491': 038859
'1492': 038878
'1493': 038879
'1494': 038880
'1495': 038881
'1496': 038882
'1497': 038884
'1498': 038886
'1499': 038887
'1500': 038888
'1501': 038890
'1502': 038891
'1503': 038892
'1504': 038893
'1505': 038894
'1506': 038895
'1507': 038896
'1508': 038898
'1509': 038899
'1510': 038900
'1511': 038901
'1512': 038902
'1513': 038904
'1514': 038905
'1515': 038906
'1516': 038907
'1517': 038908
'1518': 038910
'1519': 038911
'1520': 038912
'1521': 038914
'1522': 038955
'1523': 038961
'1524': 038964
'1525': 038965
'1526': 038966
'1527': 038967
'1528': 039188
'1529': 039259
'1530': 039278
'1531': 039291
'1532': 039298
'1533': 039316
'1534': 039317
'1535': 039318
'1536': 039357
'1537': 039359
'1538': 039378
'1539': 039484
'1540': 039488
'1541': 039530
'1542': 039605
'1543': 039607
'1544': 039658
'1545': 039659
'1546': 039660
'1547': 039661
'1548': 039662
'1549': 039663
'1550': 039664
'1551': 039665
'1552': 039666
'1553': 039667
'1554': 039875
'1555': 039900
'1556': 039904
'1557': '040121'
'1558': '040122'
'1559': '040123'
'1560': '040133'
'1561': '040134'
'1562': 040139
'1563': '040141'
'1564': '040147'
'1565': '040161'
'1566': 040180
'1567': 040182
'1568': 040229
'1569': '040230'
'1570': '040231'
'1571': '040232'
'1572': '040233'
'1573': '040234'
'1574': '040235'
'1575': '040236'
'1576': '040237'
'1577': 040238
'1578': 040239
'1579': '040240'
'1580': '040241'
'1581': '040242'
'1582': '040243'
'1583': '040244'
'1584': '040245'
'1585': '040250'
'1586': 040509
'1587': '040525'
'1588': '040541'
'1589': '040542'
'1590': 040598
'1591': '040654'
'1592': '040655'
'1593': '040656'
'1594': '040657'
'1595': 040658
'1596': 040659
'1597': '040660'
'1598': 040683
'1599': '040725'
'1600': 040842
'1601': 040843
'1602': 040844
'1603': 040845
'1604': 040851
'1605': 040903
'1606': 040908
'1607': 040909
'1608': 040938
'1609': 040940
'1610': 040984
'1611': 040985
'1612': 040986
'1613': 041018
'1614': 041019
'1615': '041020'
'1616': '041054'
'1617': 041095
'1618': '041147'
'1619': 041191
'1620': 041192
'1621': '041310'
'1622': 041381
'1623': 041568
'1624': '041570'
'1625': '041573'
'1626': '041605'
'1627': 041709
'1628': '041714'
'1629': 041812
'1630': 041819
'1631': 041820
'1632': 041825
'1633': 041961
'1634': 041962
'1635': 041965
'1636': 041971
'1637': 041983
'1638': '042014'
'1639': '042016'
'1640': '042017'
'1641': 042018
'1642': 042019
'1643': '042020'
'1644': '042023'
'1645': '042025'
'1646': 042029
'1647': '042030'
'1648': '042031'
'1649': '042040'
'1650': '042044'
'1651': '042045'
'1652': '042046'
'1653': 042048
'1654': 042119
'1655': '042126'
'1656': 042129
'1657': '042135'
'1658': 042138
'1659': 042139
'1660': '042141'
'1661': '042146'
'1662': '042234'
'1663': '042235'
'1664': '042236'
'1665': 042238
'1666': '042240'
'1667': '042241'
'1668': '042243'
'1669': '042245'
'1670': '042247'
'1671': '042310'
'1672': '042372'
'1673': '042373'
'1674': '042374'
'1675': '042375'
'1676': '042376'
'1677': '042377'
'1678': '042442'
'1679': '042463'
'1680': '042475'
'1681': 042648
'1682': 042659
'1683': '042751'
'1684': '042761'
'1685': 042789
'1686': 042844
'1687': 042851
'1688': 042911
'1689': 042914
'1690': 042915
'1691': 042966
'1692': 042984
'1693': '043016'
'1694': 043018
'1695': 043019
'1696': '043020'
'1697': '043021'
'1698': '043022'
'1699': '043023'
'1700': '043024'
'1701': '043025'
'1702': '043026'
'1703': '043027'
'1704': 043028
'1705': 043029
'1706': '043030'
'1707': '043063'
'1708': '043172'
'1709': '043173'
'1710': '043516'
'1711': '043517'
'1712': 043518
'1713': 043519
'1714': '043520'
'1715': '043521'
'1716': '043533'
'1717': '043534'
'1718': '043535'
'1719': '043536'
'1720': 043585
'1721': 043586
'1722': 043587
'1723': 043588
'1724': 043589
'1725': 043590
'1726': 043592
'1727': 043593
'1728': 043594
'1729': 043595
'1730': 043596
'1731': 043598
'1732': 043599
'1733': '043600'
'1734': 043608
'1735': '043621'
'1736': '043623'
'1737': 043691
'1738': 043695
'1739': 043696
'1740': 043697
'1741': 043698
'1742': 043699
'1743': '043761'
'1744': '043765'
'1745': '043766'
'1746': '043767'
'1747': 043768
'1748': '043773'
'1749': 043796
'1750': 043842
'1751': 043843
'1752': 043844
'1753': 043856
'1754': 043857
'1755': 043858
'1756': 043859
'1757': 043860
'1758': 043861
'1759': 043863
'1760': 043865
'1761': 043866
'1762': 043867
'1763': 043868
'1764': 043869
'1765': 043883
'1766': 043886
'1767': 043899
'1768': 043911
'1769': 043962
'1770': 043965
'1771': 044092
'1772': '044110'
'1773': 044169
'1774': '044236'
'1775': '044342'
'1776': '044347'
'1777': '044354'
'1778': '044355'
'1779': '044777'
'1780': 044778
'1781': 044779
'1782': 044780
'1783': 044781
'1784': 044782
'1785': 044791
'1786': 044792
'1787': 044793
'1788': 044794
'1789': 044795
'1790': 044796
'1791': 044797
'1792': 044798
'1793': 044799
'1794': 044800
'1795': 044801
'1796': 044802
'1797': 044803
'1798': 044804
'1799': 044805
'1800': 044806
'1801': 044809
'1802': 044820
'1803': 044821
'1804': 044822
'1805': 044823
'1806': 044848
'1807': 044849
'1808': 044850
'1809': 044851
'1810': 044853
'1811': 044854
'1812': 044917
'1813': 044918
'1814': 044946
'1815': 044947
'1816': 044948
'1817': 044949
'1818': 044950
'1819': 044951
'1820': 044952
'1821': '045055'
'1822': 045099
'1823': '045100'
'1824': '045101'
'1825': '045102'
'1826': '045103'
'1827': 045119
'1828': '045122'
'1829': '045125'
'1830': '045126'
'1831': '045127'
'1832': 045128
'1833': 045149
'1834': '045150'
'1835': '045151'
'1836': '045152'
'1837': '045153'
'1838': '045154'
'1839': '045335'
'1840': 045387
'1841': 045388
'1842': 045389
'1843': 045390
'1844': 045391
'1845': 045392
'1846': 045393
'1847': '045474'
'1848': '045475'
'1849': 045508
'1850': '045513'
'1851': '045514'
'1852': '045515'
'1853': '045516'
'1854': '045517'
'1855': 045518
'1856': 045519
'1857': '045520'
'1858': '045521'
'1859': '045522'
'1860': '045523'
'1861': 045934
'1862': 045941
'1863': '046024'
'1864': '046043'
'1865': 046058
'1866': 046068
'1867': 046078
'1868': 046079
'1869': '046157'
'1870': 046158
'1871': 046159
'1872': '046160'
'1873': '046161'
'1874': '046162'
'1875': 046238
'1876': '046241'
'1877': '046525'
'1878': '046611'
'1879': '046711'
'1880': '046717'
'1881': 046718
'1882': '046720'
'1883': '046726'
'1884': '046732'
'1885': '046733'
'1886': '046736'
'1887': 046839
'1888': 046840
'1889': 046841
'1890': 046842
'1891': 046844
'1892': 046846
'1893': 046854
'1894': 046855
'1895': 046928
'1896': 046930
'1897': '047032'
'1898': 047068
'1899': 047069
'1900': '047070'
'1901': '047071'
'1902': '047072'
'1903': '047073'
'1904': '047074'
'1905': '047075'
'1906': '047076'
'1907': '047077'
'1908': '047100'
'1909': 047192
'1910': 047193
'1911': 047194
'1912': 047195
'1913': 047196
'1914': 047197
'1915': 047198
'1916': 047199
'1917': '047200'
'1918': '047201'
'1919': '047202'
'1920': '047260'
'1921': '047471'
'1922': '047506'
'1923': '047510'
'1924': '047526'
'1925': 047628
'1926': '047657'
'1927': 047658
'1928': 047659
'1929': '047660'
'1930': '047661'
'1931': '047662'
'1932': '047663'
'1933': '047665'
'1934': '047666'
'1935': '047670'
'1936': '047671'
'1937': '047707'
'1938': 047826
'1939': 047835
'1940': 047865
'1941': 047868
'1942': 047894
'1943': 047895
'1944': 047896
'1945': 047897
'1946': 047916
'1947': 047921
'1948': 048015
'1949': 048042
'1950': 048043
'1951': 048044
'1952': 048046
'1953': 048269
'1954': 048293
'1955': 048307
'1956': 048317
'1957': 048367
'1958': 048368
'1959': 048369
'1960': 048437
'1961': 048439
'1962': 048440
'1963': 048442
'1964': 048443
'1965': 048444
'1966': 048446
'1967': 048450
'1968': 048452
'1969': 048453
'1970': 048454
'1971': 048456
'1972': 048457
'1973': 048462
'1974': 048463
'1975': 048464
'1976': 048465
'1977': 048466
'1978': 048488
'1979': 048489
'1980': 048491
'1981': 048492
'1982': 048493
'1983': 048494
'1984': 048763
'1985': 048808
'1986': 048815
'1987': 048861
'1988': 048862
'1989': 048863
'1990': 048864
'1991': 048865
'1992': 048931
'1993': 048990
'1994': 048999
'1995': 049029
'1996': 049030
'1997': 049039
'1998': 049061
'1999': 049062
'2000': 049064
'2001': 049066
'2002': 049067
'2003': 049068
'2004': 049070
'2005': 049071
'2006': 049072
'2007': 049073
'2008': 049394
'2009': 049401
'2010': 049407
'2011': 049408
'2012': 049441
'2013': 049473
'2014': 049476
'2015': 049477
'2016': 049478
'2017': 049479
'2018': 049812
'2019': 049817
'2020': 049842
'2021': 049843
'2022': 049844
'2023': 049845
'2024': 049846
'2025': 049847
'2026': 049848
'2027': 049849
'2028': 049856
'2029': 049857
'2030': '050264'
'2031': '050272'
'2032': '050276'
'2033': 050283
'2034': '050323'
'2035': '050444'
'2036': '050445'
'2037': '050446'
'2038': '050447'
'2039': 050448
'2040': 050449
'2041': 050539
'2042': '050543'
'2043': '050752'
'2044': '050753'
'2045': '050754'
'2046': 050836
'2047': 050952
'2048': 050955
'2049': 050956
'2050': '051004'
'2051': '051005'
'2052': '051006'
'2053': '051111'
'2054': '051112'
'2055': '051113'
'2056': '051114'
'2057': '051115'
'2058': '051117'
'2059': 051118
'2060': '051120'
'2061': '051157'
'2062': 051158
'2063': '051203'
'2064': '051260'
'2065': '051261'
'2066': '051262'
'2067': '051263'
'2068': '051265'
'2069': '051267'
'2070': 051268
'2071': 051269
'2072': '051271'
'2073': '051272'
'2074': '051273'
'2075': '051274'
'2076': '051275'
'2077': '051276'
'2078': 051278
'2079': 051291
'2080': 051292
'2081': '051301'
'2082': '051305'
'2083': '051333'
'2084': 051479
'2085': '051655'
'2086': 051659
'2087': '051661'
'2088': '051776'
'2089': 051784
'2090': 051785
'2091': 051918
'2092': 051919
'2093': 051923
'2094': 051954
'2095': 051991
'2096': 051992
'2097': 051998
'2098': 051999
'2099': '052000'
'2100': '052001'
'2101': '052034'
'2102': '052035'
'2103': '052036'
'2104': '052037'
'2105': 052039
'2106': '052040'
'2107': '052041'
'2108': '052042'
'2109': '052044'
'2110': '052045'
'2111': 052118
'2112': 052119
'2113': '052120'
'2114': '052121'
'2115': '052122'
'2116': '052123'
'2117': '052124'
'2118': '052125'
'2119': '052126'
'2120': '052127'
'2121': 052128
'2122': 052129
'2123': '052141'
'2124': '052375'
'2125': 052380
'2126': 052389
'2127': 052393
'2128': 052409
'2129': '052446'
'2130': '052447'
'2131': 052448
'2132': 052449
'2133': '052451'
'2134': '052452'
'2135': '052500'
'2136': '052501'
'2137': '052502'
'2138': 052508
'2139': '052522'
'2140': 052579
'2141': 052628
'2142': 052629
'2143': '052630'
'2144': '052631'
'2145': '052632'
'2146': '052633'
'2147': '052634'
'2148': '052635'
'2149': '052636'
'2150': '052637'
'2151': 052638
'2152': 052639
'2153': '052641'
'2154': '052642'
'2155': '052644'
'2156': '052645'
'2157': '052646'
'2158': '052647'
'2159': 052648
'2160': 052649
'2161': '052650'
'2162': 052859
'2163': 052860
'2164': 052861
'2165': 052862
'2166': 052945
'2167': 052946
'2168': 052947
'2169': 052948
'2170': 052950
'2171': 052951
'2172': 052953
'2173': 052954
'2174': 052955
'2175': '053152'
'2176': '053154'
'2177': '053156'
'2178': '053157'
'2179': 053158
'2180': 053159
'2181': '053160'
'2182': 053228
'2183': 053229
'2184': 053299
'2185': '053300'
'2186': '053301'
'2187': '053302'
'2188': 053379
'2189': 053381
'2190': '053457'
'2191': 053496
'2192': '053576'
'2193': 053578
'2194': 053586
'2195': 053587
'2196': 053588
'2197': 053589
'2198': 053591
'2199': 053592
'2200': '053675'
'2201': '053723'
'2202': '053724'
'2203': '053725'
'2204': '053726'
'2205': '053727'
'2206': 053728
'2207': 053729
'2208': 053807
'2209': 053862
'2210': 053863
'2211': 053937
'2212': 054019
'2213': '054031'
'2214': '054032'
'2215': '054033'
'2216': '054034'
'2217': '054037'
'2218': 054039
'2219': '054061'
'2220': '054062'
'2221': '054063'
'2222': '054064'
'2223': 054149
'2224': '054150'
'2225': '054151'
'2226': '054152'
'2227': '054153'
'2228': '054154'
'2229': '054155'
'2230': '054156'
'2231': 054158
'2232': 054159
'2233': '054160'
'2234': '054163'
'2235': '054234'
'2236': '054235'
'2237': '054236'
'2238': '054237'
'2239': 054297
'2240': '054335'
'2241': '054365'
'2242': '054376'
'2243': '054433'
'2244': '054436'
'2245': '054437'
'2246': 054438
'2247': '054442'
'2248': '054443'
'2249': '054463'
'2250': '054464'
'2251': '054465'
'2252': '054466'
'2253': '054467'
'2254': 054468
'2255': 054469
'2256': '054470'
'2257': '054475'
'2258': '054476'
'2259': 054479
'2260': 054480
'2261': 054481
'2262': 054482
'2263': 054496
'2264': '054554'
'2265': 054568
'2266': '054570'
'2267': '054576'
'2268': 054578
'2269': 054580
'2270': '054621'
'2271': '054623'
'2272': '054624'
'2273': '054625'
'2274': '054626'
'2275': '054662'
'2276': '054664'
'2277': '054665'
'2278': '054666'
'2279': '054667'
'2280': '054703'
'2281': 054719
'2282': '054735'
'2283': '054753'
'2284': 054874
'2285': 054942
'2286': '055076'
'2287': 055097
'2288': '055100'
'2289': '055101'
'2290': '055102'
'2291': '055113'
'2292': 055119
'2293': '055120'
'2294': '055121'
'2295': '055122'
'2296': '055123'
'2297': '055124'
'2298': 055149
'2299': 055183
'2300': 055186
'2301': '055231'
'2302': '055232'
'2303': '055233'
'2304': '055234'
'2305': '055235'
'2306': '055236'
'2307': '055237'
'2308': 055238
'2309': '055240'
'2310': '055241'
'2311': '055242'
'2312': 055285
'2313': 055286
'2314': 055287
'2315': 055288
'2316': 055289
'2317': 055290
'2318': 055291
'2319': 055292
'2320': 055293
'2321': 055294
'2322': 055295
'2323': '055402'
'2324': '055430'
'2325': '055436'
'2326': '055437'
'2327': 055480
'2328': 055481
'2329': 055549
'2330': '055572'
'2331': 055709
'2332': '055710'
'2333': '055711'
'2334': '055712'
'2335': '055713'
'2336': '055714'
'2337': '055715'
'2338': '055716'
'2339': '055717'
'2340': 055718
'2341': 055719
'2342': 055782
'2343': 055783
'2344': 055786
'2345': 055807
'2346': 055808
'2347': 055809
'2348': 055810
'2349': 055811
'2350': 055826
'2351': 055827
'2352': 055828
'2353': 055830
'2354': 055831
'2355': 055832
'2356': 055833
'2357': 055900
'2358': '056010'
'2359': '056015'
'2360': '056020'
'2361': 056028
'2362': 056029
'2363': '056030'
'2364': '056031'
'2365': '056033'
'2366': '056034'
'2367': '056036'
'2368': '056247'
'2369': 056248
'2370': 056249
'2371': '056273'
'2372': '056274'
'2373': '056275'
'2374': '056460'
'2375': '056465'
'2376': '056466'
'2377': '056467'
'2378': 056468
'2379': 056469
'2380': '056470'
'2381': '056471'
'2382': '056472'
'2383': '056474'
'2384': 056493
'2385': 056495
'2386': 056496
'2387': 056497
'2388': 056498
'2389': 056499
'2390': '056516'
'2391': '056517'
'2392': 056518
'2393': 056519
'2394': '056520'
'2395': '056521'
'2396': '056523'
'2397': '056552'
'2398': 056559
'2399': 056639
'2400': '056640'
'2401': '056641'
'2402': '056645'
'2403': '056646'
'2404': 056648
'2405': 056649
'2406': '056650'
'2407': '056651'
'2408': 056686
'2409': 056687
'2410': 056688
'2411': 056689
'2412': 056690
'2413': 056691
'2414': 056692
'2415': 056693
'2416': 056694
'2417': 056695
'2418': 056696
'2419': 056795
'2420': 056796
'2421': 056797
'2422': 056798
'2423': 056799
'2424': 056800
'2425': 056801
'2426': 056802
'2427': 056803
'2428': 056804
'2429': 056805
'2430': 056874
'2431': 056888
'2432': 056895
'2433': 056929
'2434': 057078
'2435': '057164'
'2436': '057175'
'2437': '057176'
'2438': '057177'
'2439': 057178
'2440': 057179
'2441': 057180
'2442': '057271'
'2443': '057272'
'2444': '057273'
'2445': '057274'
'2446': '057344'
'2447': '057360'
'2448': '057371'
'2449': '057417'
'2450': 057418
'2451': '057435'
'2452': '057437'
'2453': 057439
'2454': '057440'
'2455': '057442'
'2456': '057500'
'2457': '057540'
'2458': 057569
'2459': '057626'
'2460': '057627'
'2461': 057628
'2462': 057629
'2463': '057630'
'2464': 057639
'2465': '057640'
'2466': 057648
'2467': 057658
'2468': '057661'
'2469': '057662'
'2470': '057663'
'2471': '057665'
'2472': 057691
'2473': 057697
'2474': 057819
'2475': 057820
'2476': 057821
'2477': 057822
'2478': 057823
'2479': 057891
'2480': 057892
'2481': 057936
'2482': 057937
'2483': 057938
'2484': 057939
'2485': 057943
'2486': 057968
'2487': 058052
'2488': 058053
'2489': 058054
'2490': 058060
'2491': 058061
'2492': 058063
'2493': 058068
'2494': 058070
'2495': 058115
'2496': 058116
'2497': 058117
'2498': 058135
'2499': 058140
'2500': 058161
'2501': 058162
'2502': 058164
'2503': 058166
'2504': 058169
'2505': 058170
'2506': 058173
'2507': 058174
'2508': 058207
'2509': 058212
'2510': 058213
'2511': 058215
'2512': 058221
'2513': 058225
'2514': 058333
'2515': 058334
'2516': 058341
'2517': 058474
'2518': 058539
'2519': 058540
'2520': 058541
'2521': 058542
'2522': 058543
'2523': 059078
'2524': 059373
'2525': 059374
'2526': 059443
'2527': 059445
'2528': 059446
'2529': 059448
'2530': 059449
'2531': 059451
'2532': 059454
'2533': 059561
'2534': 059562
'2535': 059581
'2536': 059653
'2537': 059654
'2538': 059656
'2539': 059657
'2540': 059658
'2541': 059659
'2542': 059660
'2543': 059663
'2544': 059664
'2545': 059666
'2546': 059667
'2547': 059669
'2548': 059671
'2549': 059673
'2550': 059675
'2551': 059676
'2552': 059677
'2553': 059678
'2554': 059679
'2555': 059680
'2556': 059681
'2557': 059682
'2558': 059683
'2559': 059684
'2560': 059685
'2561': 059686
'2562': 059687
'2563': 059688
'2564': 059695
'2565': 059702
'2566': 059706
'2567': 059707
'2568': 059708
'2569': 059709
'2570': 059710
'2571': 059711
'2572': 059718
'2573': 059719
'2574': 059720
'2575': 059721
'2576': 059723
'2577': 059724
'2578': 059725
'2579': 059726
'2580': 059727
'2581': 059823
'2582': 059876
'2583': 059930
'2584': '060037'
'2585': 060038
'2586': '060041'
'2587': '060042'
'2588': '060045'
'2589': 060048
'2590': '060074'
'2591': '060143'
'2592': '060144'
'2593': '060145'
'2594': '060146'
'2595': '060170'
'2596': '060317'
'2597': '060331'
'2598': '060472'
'2599': '060474'
'2600': '060476'
'2601': '060477'
'2602': 060478
'2603': '060510'
'2604': '060533'
'2605': '060534'
'2606': '060535'
'2607': '060536'
'2608': '060537'
'2609': '060544'
'2610': '060547'
'2611': 060548
'2612': 060549
'2613': '060736'
'2614': '060753'
'2615': '060754'
'2616': '060755'
'2617': '060756'
'2618': '060757'
'2619': 060758
'2620': '060775'
'2621': '060776'
'2622': '060777'
'2623': 060857
'2624': 060864
'2625': 060865
'2626': 060871
'2627': 060872
'2628': 060873
'2629': 060874
'2630': 060875
'2631': 060994
'2632': '061006'
'2633': '061007'
'2634': 061008
'2635': '061010'
'2636': '061011'
'2637': '061012'
'2638': '061013'
'2639': 061159
'2640': '061160'
'2641': '061161'
'2642': '061172'
'2643': '061174'
'2644': '061175'
'2645': '061452'
'2646': '061453'
'2647': 061491
'2648': 061492
'2649': 061493
'2650': 061587
'2651': 061589
'2652': 061591
'2653': 061592
'2654': 061668
'2655': '061670'
'2656': 061679
'2657': '061734'
'2658': '061736'
'2659': '061742'
'2660': 061814
'2661': 061820
'2662': 061821
'2663': 061884
'2664': '062001'
'2665': '062003'
'2666': '062005'
'2667': '062007'
'2668': '062163'
'2669': '062164'
'2670': '062165'
'2671': 062180
'2672': 062183
'2673': 062184
'2674': 062185
'2675': 062186
'2676': 062187
'2677': 062188
'2678': 062189
'2679': 062190
'2680': 062191
'2681': 062192
'2682': 062193
'2683': 062194
'2684': 062195
'2685': 062196
'2686': '062337'
'2687': '062426'
'2688': '062436'
'2689': '062445'
'2690': '062446'
'2691': 062448
'2692': 062449
'2693': '062450'
'2694': '062452'
'2695': 062458
'2696': '062525'
'2697': '062526'
'2698': '062527'
'2699': 062528
'2700': 062529
'2701': '062531'
'2702': '062532'
'2703': '062533'
'2704': '062534'
'2705': 062586
'2706': 062589
'2707': 062591
'2708': 062592
'2709': 062594
'2710': 062595
'2711': 062596
'2712': '062655'
'2713': '062671'
'2714': '062742'
'2715': 062748
'2716': 062749
'2717': '062750'
'2718': '062751'
'2719': '062753'
'2720': '063043'
'2721': '063044'
'2722': '063045'
'2723': '063064'
'2724': '063065'
'2725': '063117'
'2726': 063149
'2727': 063159
'2728': '063161'
'2729': 063191
'2730': 063208
'2731': '063224'
'2732': '063226'
'2733': '063250'
'2734': '063251'
'2735': '063252'
'2736': '063253'
'2737': '063255'
'2738': '063257'
'2739': 063258
'2740': 063287
'2741': 063289
'2742': 063290
'2743': 063291
'2744': 063292
'2745': '063456'
'2746': '063457'
'2747': '063470'
'2748': '063471'
'2749': '063472'
'2750': '063626'
'2751': '063655'
'2752': '063733'
'2753': '063747'
'2754': '063755'
'2755': '063757'
'2756': '063770'
'2757': 063789
'2758': 063803
'2759': 063804
'2760': 063805
'2761': 063874
'2762': 063900
'2763': 063908
'2764': 063922
'2765': 063936
'2766': 063999
'2767': '064005'
'2768': '064006'
'2769': '064007'
'2770': 064008
'2771': 064009
'2772': '064035'
'2773': 064078
'2774': 064079
'2775': 064091
'2776': 064093
'2777': '064247'
'2778': 064248
'2779': 064249
'2780': '064252'
'2781': '064253'
'2782': '064331'
'2783': '064332'
'2784': '064333'
'2785': '064334'
'2786': 064338
'2787': '064364'
'2788': '064365'
'2789': '064366'
'2790': '064407'
'2791': 064408
'2792': 064409
'2793': '064410'
'2794': '064515'
'2795': '064516'
'2796': '064517'
'2797': 064519
'2798': '064520'
'2799': '064521'
'2800': '064522'
'2801': '064523'
'2802': '064535'
'2803': '064536'
'2804': '064537'
'2805': 064538
'2806': '064542'
'2807': '064553'
'2808': '064556'
'2809': '064567'
'2810': 064590
'2811': 064591
'2812': 064592
'2813': 064593
'2814': 064594
'2815': '064601'
'2816': '064604'
'2817': 064618
'2818': '064625'
'2819': '064626'
'2820': '064627'
'2821': 064628
'2822': 064629
'2823': '064630'
'2824': '064631'
'2825': 064659
'2826': 064787
'2827': 064788
'2828': 064789
'2829': 064796
'2830': 064809
'2831': 064834
'2832': 064840
'2833': 064841
'2834': 064854
'2835': 064855
'2836': 064856
'2837': 064857
'2838': 064858
'2839': 064859
'2840': 064860
'2841': 064861
'2842': 064862
'2843': 064863
'2844': 064864
'2845': 064865
'2846': 064866
'2847': 064893
'2848': 064895
'2849': 064896
'2850': 064918
'2851': 064919
'2852': 064988
'2853': 064989
'2854': 064990
'2855': 064991
'2856': 064992
'2857': 064993
'2858': 064994
'2859': 064995
'2860': '065037'
'2861': 065038
'2862': 065039
'2863': '065040'
'2864': '065063'
'2865': '065064'
'2866': '065073'
'2867': '065076'
'2868': '065077'
'2869': 065090
'2870': '065234'
'2871': '065265'
'2872': 065488
'2873': 065619
'2874': 065683
'2875': 065685
'2876': '065745'
'2877': '065752'
'2878': '065755'
'2879': '065756'
'2880': '065777'
'2881': 065779
'2882': 065780
'2883': 065893
'2884': 066058
'2885': '066073'
'2886': '066074'
'2887': '066075'
'2888': '066076'
'2889': 066180
'2890': 066187
'2891': 066390
'2892': 066394
'2893': '066405'
'2894': 066469
'2895': 066482
'2896': 066483
'2897': '066525'
'2898': '066534'
'2899': '066535'
'2900': '066536'
'2901': '066537'
'2902': 066538
'2903': 066539
'2904': '066636'
'2905': '066637'
'2906': 066638
'2907': '066641'
'2908': '066643'
'2909': '066644'
'2910': '066646'
'2911': 066648
'2912': 066649
'2913': '066650'
'2914': 066689
'2915': 066690
'2916': '066717'
'2917': '066757'
'2918': 066782
'2919': 066783
'2920': '067007'
'2921': '067010'
'2922': '067011'
'2923': '067016'
'2924': '067017'
'2925': '067121'
'2926': '067163'
'2927': '067232'
'2928': '067233'
'2929': '067235'
'2930': '067237'
'2931': 067308
'2932': '067330'
'2933': '067331'
'2934': '067332'
'2935': '067333'
'2936': '067334'
'2937': '067336'
'2938': '067357'
'2939': 067358
'2940': 067359
'2941': '067360'
'2942': '067361'
'2943': '067362'
'2944': '067363'
'2945': '067364'
'2946': '067365'
'2947': '067366'
'2948': '067367'
'2949': 067368
'2950': '067412'
'2951': '067457'
'2952': '067470'
'2953': '067500'
'2954': '067553'
'2955': '067556'
'2956': '067557'
'2957': 067558
'2958': 067597
'2959': 067598
'2960': '067600'
'2961': '067637'
'2962': 067638
'2963': 067639
'2964': '067640'
'2965': '067660'
'2966': '067661'
'2967': '067673'
'2968': '067707'
'2969': '067760'
'2970': '067763'
'2971': '067764'
'2972': '067765'
'2973': '067766'
'2974': 067784
'2975': 067793
'2976': 067829
'2977': 068353
'2978': 068354
'2979': 068355
'2980': 068356
'2981': 068404
'2982': 068407
'2983': 068410
'2984': 068444
'2985': 068531
'2986': 068536
'2987': 068537
'2988': 068538
'2989': 068539
'2990': 068540
'2991': 068541
'2992': 068543
'2993': 068549
'2994': 068551
'2995': 068573
'2996': 068579
'2997': 068582
'2998': 068587
'2999': 068592
'3000': 068600
'3001': 068601
'3002': 068680
'3003': 068682
'3004': 068683
'3005': 068820
'3006': 068821
'3007': 068837
'3008': 068838
'3009': 068839
'3010': 068840
'3011': 068841
'3012': 068842
'3013': 068843
'3014': 068844
'3015': 068851
'3016': 068852
'3017': 068853
'3018': 068854
'3019': 068860
'3020': 068861
'3021': 068862
'3022': 068869
'3023': 068872
'3024': 068875
'3025': 068891
'3026': 068892
'3027': 068893
'3028': 068894
'3029': 068895
'3030': 068896
'3031': 068897
'3032': 068898
'3033': 068899
'3034': 068909
'3035': 069001
'3036': 069002
'3037': 069170
'3038': 069181
'3039': 069182
'3040': 069188
'3041': 069193
'3042': 069194
'3043': 069195
'3044': 069196
'3045': 069197
'3046': 069198
'3047': 069199
'3048': 069200
'3049': 069201
'3050': 069202
'3051': 069203
'3052': 069204
'3053': 069205
'3054': 069206
'3055': 069207
'3056': 069208
'3057': 069209
'3058': 069210
'3059': 069211
'3060': 069221
'3061': 069222
'3062': 069223
'3063': 069303
'3064': 069554
'3065': 069555
'3066': 069561
'3067': 069563
'3068': 069564
'3069': 069567
'3070': 069682
'3071': 069723
'3072': 069726
'3073': 069727
'3074': 069732
'3075': 069744
'3076': 069745
'3077': 069746
'3078': 069747
'3079': 069761
'3080': 069762
'3081': 069763
'3082': 069764
'3083': 069765
'3084': 069766
'3085': 069767
'3086': 069768
'3087': 069781
'3088': 069784
'3089': 069785
'3090': 069787
'3091': 069788
'3092': 069789
'3093': 069791
'3094': 069792
'3095': 069793
'3096': 069798
'3097': 069822
'3098': 069823
'3099': 069824
'3100': 069825
'3101': 069826
'3102': 069827
'3103': 069828
'3104': 069830
'3105': 069833
'3106': 069904
'3107': 069947
'3108': 069949
'3109': 069985
'3110': '070002'
'3111': '070005'
'3112': '070174'
'3113': '070206'
'3114': '070207'
'3115': 070208
'3116': 070299
'3117': '070300'
'3118': '070301'
'3119': '070302'
'3120': '070303'
'3121': '070402'
'3122': '070403'
'3123': 070409
'3124': '070423'
'3125': '070424'
'3126': '070425'
'3127': '070426'
'3128': '070654'
'3129': '070655'
'3130': '070657'
'3131': '070660'
'3132': 070768
'3133': '070770'
'3134': '070772'
'3135': '070773'
'3136': '070774'
'3137': '070775'
'3138': 070813
'3139': 070873
'3140': 070875
'3141': 070878
'3142': 070879
'3143': 071096
'3144': '071133'
'3145': '071157'
'3146': 071158
'3147': '071172'
'3148': '071173'
'3149': '071174'
'3150': '071175'
'3151': '071216'
'3152': '071225'
'3153': 071228
'3154': '071230'
'3155': '071231'
'3156': '071240'
'3157': '071241'
'3158': '071242'
'3159': '071243'
'3160': '071244'
'3161': '071245'
'3162': '071246'
'3163': '071247'
'3164': 071248
'3165': 071249
'3166': '071250'
'3167': '071251'
'3168': '071252'
'3169': '071253'
'3170': '071254'
'3171': '071255'
'3172': '071276'
'3173': '071303'
'3174': '071304'
'3175': '071371'
'3176': '071372'
'3177': '071420'
'3178': '071503'
'3179': '071506'
'3180': '071507'
'3181': 071508
'3182': 071509
'3183': '071510'
'3184': '071511'
'3185': '071512'
'3186': '071513'
'3187': '071514'
'3188': '071515'
'3189': '071516'
'3190': '071617'
'3191': '071620'
'3192': '071622'
'3193': 071690
'3194': 071691
'3195': 071692
'3196': 071693
'3197': 071694
'3198': 071695
'3199': 071709
'3200': '071711'
'3201': '071714'
'3202': '071715'
'3203': 071719
'3204': '071721'
'3205': '071722'
'3206': 071822
'3207': 071884
'3208': 071885
'3209': 071937
'3210': 071938
'3211': '072046'
'3212': '072047'
'3213': '072050'
'3214': '072056'
'3215': 072058
'3216': 072059
'3217': '072064'
'3218': '072067'
'3219': 072068
'3220': 072069
'3221': '072070'
'3222': '072071'
'3223': '072072'
'3224': '072073'
'3225': '072074'
'3226': '072075'
'3227': '072076'
'3228': 072129
'3229': '072130'
'3230': '072131'
'3231': '072134'
'3232': '072135'
'3233': '072136'
'3234': '072146'
'3235': 072149
'3236': '072200'
'3237': '072206'
'3238': '072210'
'3239': '072215'
'3240': '072232'
'3241': '072233'
'3242': '072234'
'3243': 072287
'3244': 072288
'3245': 072289
'3246': 072290
'3247': '072456'
'3248': 072468
'3249': '072476'
'3250': '072477'
'3251': '072513'
'3252': '072514'
'3253': '072562'
'3254': '072565'
'3255': '072570'
'3256': '072604'
'3257': '072605'
'3258': '072607'
'3259': '072612'
'3260': 072738
'3261': 072781
'3262': 072782
'3263': 072783
'3264': 072784
'3265': 072785
'3266': 072786
'3267': 072787
'3268': 072788
'3269': 072789
'3270': 072790
'3271': 072926
'3272': 072927
'3273': 072928
'3274': 072930
'3275': 073087
'3276': 073099
'3277': '073100'
'3278': '073123'
'3279': '073124'
'3280': '073125'
'3281': 073169
'3282': '073170'
'3283': '073171'
'3284': '073172'
'3285': '073174'
'3286': '073175'
'3287': 073192
'3288': 073193
'3289': '073306'
'3290': 073309
'3291': 073318
'3292': '073335'
'3293': '073340'
'3294': '073341'
'3295': '073342'
'3296': '073343'
'3297': '073344'
'3298': '073363'
'3299': '073365'
'3300': '073366'
'3301': '073367'
'3302': 073368
'3303': 073369
'3304': '073370'
'3305': '073371'
'3306': '073372'
'3307': '073465'
'3308': '073466'
'3309': '073467'
'3310': 073468
'3311': 073469
'3312': 073486
'3313': 073494
'3314': 073495
'3315': 073519
'3316': '073520'
'3317': '073521'
'3318': '073522'
'3319': '073550'
'3320': '073551'
'3321': '073560'
'3322': '073561'
'3323': '073564'
'3324': '073565'
'3325': '073566'
'3326': 073568
'3327': '073572'
'3328': '073573'
'3329': 073580
'3330': 073584
'3331': 073585
'3332': 073587
'3333': 073658
'3334': '073675'
'3335': '073760'
'3336': '073761'
'3337': '073762'
'3338': '073763'
'3339': '073764'
'3340': '073765'
'3341': '073766'
'3342': '073767'
'3343': 073768
'3344': 073769
'3345': '073770'
'3346': '073771'
'3347': '073772'
'3348': '073773'
'3349': '073774'
'3350': '073775'
'3351': '073776'
'3352': '073777'
'3353': 073778
'3354': 073779
'3355': 073792
'3356': 073797
'3357': 073819
'3358': 073820
'3359': 073821
'3360': 073822
'3361': 073921
'3362': '074002'
'3363': '074302'
'3364': '074347'
'3365': 074348
'3366': '074362'
'3367': '074365'
'3368': '074370'
'3369': '074371'
'3370': '074372'
'3371': '074373'
'3372': '074374'
'3373': '074375'
'3374': '074376'
'3375': '074377'
'3376': 074378
'3377': 074380
'3378': 074381
'3379': 074382
'3380': 074383
'3381': 074384
'3382': 074385
'3383': 074386
'3384': 074387
'3385': 074388
'3386': 074389
'3387': 074390
'3388': 074391
'3389': 074392
'3390': 074393
'3391': '074421'
'3392': '074445'
'3393': '074546'
'3394': 074669
'3395': '074671'
'3396': '074706'
'3397': 074908
'3398': 074937
'3399': 074942
'3400': 074945
'3401': 074954
'3402': 074955
'3403': 074959
'3404': 074960
'3405': 075194
'3406': '075211'
'3407': '075221'
'3408': '075230'
'3409': '075304'
'3410': '075310'
'3411': '075314'
'3412': '075317'
'3413': '075371'
'3414': '075372'
'3415': '075373'
'3416': '075374'
'3417': '075375'
'3418': '075376'
'3419': '075377'
'3420': 075378
'3421': 075379
'3422': 075380
'3423': 075381
'3424': 075383
'3425': 075386
'3426': 075389
'3427': 075390
'3428': 075391
'3429': 075393
'3430': 075395
'3431': 075396
'3432': 075398
'3433': 075399
'3434': '075401'
'3435': '075403'
'3436': '075412'
'3437': '075415'
'3438': '075417'
'3439': 075418
'3440': 075419
'3441': '075420'
'3442': '075425'
'3443': '075427'
'3444': 075428
'3445': 075429
'3446': '075430'
'3447': '075431'
'3448': '075432'
'3449': '075433'
'3450': '075434'
'3451': '075435'
'3452': '075436'
'3453': '075437'
'3454': 075438
'3455': 075439
'3456': '075440'
'3457': '075441'
'3458': '075442'
'3459': '075443'
'3460': '075607'
'3461': '075612'
'3462': 075692
'3463': '075745'
'3464': '075746'
'3465': '075747'
'3466': 075748
'3467': 075749
'3468': '075750'
'3469': '075751'
'3470': '075752'
'3471': '075754'
'3472': '075755'
'3473': '075762'
'3474': '075763'
'3475': '075764'
'3476': 075782
'3477': 075783
'3478': 075784
'3479': 075785
'3480': 075786
'3481': 075787
'3482': 075788
'3483': 075844
'3484': 075862
'3485': 075866
'3486': 075869
'3487': 075883
'3488': 075903
'3489': 075908
'3490': 075925
'3491': 075926
'3492': 075927
'3493': 075928
'3494': 075929
'3495': 075930
'3496': 075931
'3497': 075932
'3498': 075933
'3499': 075935
'3500': 075936
'3501': 075937
'3502': 075975
'3503': '076036'
'3504': 076069
'3505': '076071'
'3506': '076072'
'3507': '076073'
'3508': '076074'
'3509': '076075'
'3510': '076076'
'3511': '076077'
'3512': 076078
'3513': 076079
'3514': '076121'
'3515': 076128
'3516': 076129
'3517': '076130'
'3518': '076131'
'3519': '076363'
'3520': '076375'
'3521': 076381
'3522': '076437'
'3523': '076440'
'3524': '076654'
'3525': 076659
'3526': '077517'
'3527': 077519
'3528': '077521'
'3529': '077522'
'3530': '077523'
'3531': '077564'
'3532': '077571'
'3533': '077572'
'3534': 077952
'3535': 078038
'3536': 078156
'3537': 078213
'3538': 078516
'3539': 078833
'3540': 078834
'3541': 078839
'3542': 078841
'3543': 078843
'3544': 078845
'3545': 078847
'3546': 078848
'3547': 078849
'3548': 078850
'3549': 078851
'3550': 078852
'3551': 078984
'3552': 078998
'3553': 079087
'3554': 079575
'3555': 079593
'3556': 079605
'3557': 079606
'3558': 079610
'3559': 079616
'3560': 079741
'3561': 079973
'3562': 079975
'3563': 079977
'3564': 079978
'3565': 079985
'3566': 079986
'3567': 079988
'3568': 079990
'3569': 079995
'3570': 080000
'3571': 080001
'3572': 080002
'3573': 080003
'3574': 080004
'3575': 080005
'3576': 080035
'3577': 080293
'3578': 080341
'3579': 080351
'3580': 080389
'3581': 080402
'3582': 080515
'3583': 080516
'3584': 080517
'3585': 080518
'3586': 080519
'3587': 080520
'3588': 080611
'3589': 080680
'3590': 080686
'3591': 080687
'3592': 080693
'3593': 080694
'3594': 080695
'3595': 080696
'3596': 080697
'3597': 080751
'3598': 080753
'3599': 080754
'3600': 080755
'3601': 080756
'3602': 080758
'3603': 080765
'3604': 080766
'3605': 080772
'3606': 080773
'3607': 080774
'3608': 080775
'3609': 080776
'3610': 080793
'3611': 080833
'3612': 080834
'3613': 080835
'3614': 080836
'3615': 081033
'3616': 081037
'3617': 081071
'3618': 081082
'3619': 081083
'3620': 081084
'3621': 081085
'3622': 081189
'3623': 081193
'3624': 081194
'3625': 081195
'3626': 081362
'3627': 081365
'3628': 081436
'3629': 081457
'3630': 081485
'3631': 081491
'3632': 081512
'3633': 081523
'3634': 081543
'3635': 081554
'3636': 081555
'3637': 081565
'3638': 081576
'3639': 081586
'3640': 081600
'3641': 081612
'3642': 081613
'3643': 081623
'3644': 081638
'3645': 081650
'3646': 081660
'3647': 081781
'3648': 081782
'3649': 081792
'3650': 081802
'3651': 081803
'3652': 081814
'3653': 081868
'3654': 081895
'3655': 081938
'3656': 081945
'3657': 081946
'3658': 081988
'3659': 081999
'3660': 082157
'3661': 082231
'3662': 082237
'3663': 082242
'3664': 082250
'3665': 082410
'3666': 082462
'3667': 082464
'3668': 082505
'3669': 082507
'3670': 082628
'3671': 082629
'3672': 082630
'3673': 082631
'3674': 082778
'3675': 082780
'3676': 082881
'3677': 082886
'3678': 082890
'3679': 082892
'3680': 082893
'3681': 082914
'3682': 082915
'3683': 082916
'3684': 082917
'3685': 082918
'3686': 082919
'3687': 082920
'3688': 082921
'3689': 082928
'3690': 082929
'3691': 082930
'3692': 082931
'3693': 082932
'3694': 083437
'3695': 083438
'3696': 083439
'3697': 083440
'3698': 083507
'3699': 083509
'3700': 083511
'3701': 083512
'3702': 083558
'3703': 083600
'3704': 083612
'3705': 083613
'3706': 083715
'3707': 083717
'3708': 083718
'3709': 083719
'3710': 083789
'3711': 083790
'3712': 083791
'3713': 083898
'3714': 083903
'3715': 083906
'3716': 083908
'3717': 083911
'3718': 083913
'3719': 083954
'3720': 083960
'3721': 083969
'3722': 084009
'3723': 084054
'3724': 084055
'3725': 084056
'3726': 084057
'3727': 084058
'3728': 084091
'3729': 084095
'3730': 084096
'3731': 084097
'3732': 084111
'3733': 084135
'3734': 084136
'3735': 084139
'3736': 084141
'3737': 084142
'3738': 084144
'3739': 084152
'3740': 084154
'3741': 084155
'3742': 084156
'3743': 084157
'3744': 084158
'3745': 084159
'3746': 084195
'3747': 084198
'3748': 084200
'3749': 084201
'3750': 084202
'3751': 084264
'3752': 084290
'3753': 084291
'3754': 084405
'3755': 084417
'3756': 084423
'3757': 084483
'3758': 084484
'3759': 084485
'3760': 084486
'3761': 084605
'3762': 084736
'3763': 084743
'3764': 084757
'3765': 084768
'3766': 084777
'3767': 084788
'3768': 084817
'3769': 085027
'3770': 085038
'3771': 085039
'3772': 085040
'3773': 085041
'3774': 085290
'3775': 085291
'3776': 085307
'3777': 085308
'3778': 085309
'3779': 085310
'3780': 085311
'3781': 085317
'3782': 085318
'3783': 085343
'3784': 085346
'3785': 085347
'3786': 085400
'3787': 085419
'3788': 085420
'3789': 085421
'3790': 085422
'3791': 085423
'3792': 085424
'3793': 085425
'3794': 085426
'3795': 085427
'3796': 085428
'3797': 085436
'3798': 085438
'3799': 085482
'3800': 085484
'3801': 085485
'3802': 085486
'3803': 085487
'3804': 085488
'3805': 085489
'3806': 085490
'3807': 085491
'3808': 085492
'3809': 085494
'3810': 085592
'3811': 085593
'3812': 085594
'3813': 085595
'3814': 085596
'3815': 085598
'3816': 085599
'3817': 085600
'3818': 085691
'3819': 085692
'3820': 085693
'3821': 085787
'3822': 085788
'3823': 085791
'3824': 085792
'3825': 085816
'3826': 085817
'3827': 085822
'3828': 085823
'3829': 085828
'3830': 085831
'3831': 085832
'3832': 085833
'3833': 085834
'3834': 085835
'3835': 085836
'3836': 085837
'3837': 085838
'3838': 085839
'3839': 085840
'3840': 085950
'3841': 085951
'3842': 085952
'3843': 085953
'3844': 085954
'3845': 085955
'3846': 085956
'3847': 085957
'3848': 085963
'3849': 085966
'3850': 085967
'3851': 085968
'3852': 085973
'3853': 086037
'3854': 086038
'3855': 086039
'3856': 086040
'3857': 086077
'3858': 086081
'3859': 086082
'3860': 086116
'3861': 086117
'3862': 086118
'3863': 086119
'3864': 086140
'3865': 086256
'3866': 086259
'3867': 086262
'3868': 086263
'3869': 086415
'3870': 086416
'3871': 086417
'3872': 086419
'3873': 086441
'3874': 086443
'3875': 086481
'3876': 086482
'3877': 086483
'3878': 086484
'3879': 086485
'3880': 086486
'3881': 086487
'3882': 086562
'3883': 086576
'3884': 086623
'3885': 086634
'3886': 086678
'3887': 086679
'3888': 086680
'3889': 086720
'3890': 086721
'3891': 086724
'3892': 086725
'3893': 086730
'3894': 086761
'3895': 086762
'3896': 086763
'3897': 086788
'3898': 086793
'3899': 086795
'3900': 086799
'3901': 086993
'3902': 087068
'3903': 087069
'3904': 087070
'3905': 087096
'3906': 087097
'3907': 087098
'3908': 087099
'3909': 087100
'3910': 087101
'3911': 087102
'3912': 087103
'3913': 087104
'3914': 087105
'3915': 087106
'3916': 087107
'3917': 087108
'3918': 087121
'3919': 087151
'3920': 087152
'3921': 087153
'3922': 087154
'3923': 087155
'3924': 087157
'3925': 087158
'3926': 087159
'3927': 087160
'3928': 087161
'3929': 087185
'3930': 087186
'3931': 087187
'3932': 087188
'3933': 087189
'3934': 087190
'3935': 087191
'3936': 087192
'3937': 087193
'3938': 087194
'3939': 087237
'3940': 087322
'3941': 087323
'3942': 087324
'3943': 087325
'3944': 087361
'3945': 087362
'3946': 087363
'3947': 087377
'3948': 087430
'3949': 087431
'3950': 087490
'3951': 087639
'3952': 087641
'3953': 087642
'3954': 087643
'3955': 087644
'3956': 087645
'3957': 087965
'3958': 087966
'3959': 087967
'3960': 087968
'3961': 087971
'3962': 087972
'3963': 088428
'3964': 088429
'3965': 088485
'3966': 088486
'3967': 088846
'3968': 088848
'3969': 088854
'3970': 088856
'3971': 088858
'3972': 088860
'3973': 088861
'3974': 088863
'3975': 088864
'3976': 088867
'3977': 088868
'3978': 088869
'3979': 088870
'3980': 088871
'3981': 088872
'3982': 088873
'3983': 088874
'3984': 088875
'3985': 088876
'3986': 088877
'3987': 088878
'3988': 088879
'3989': 088892
'3990': 088899
'3991': 088900
'3992': 088959
'3993': 088960
'3994': 089178
'3995': 089179
'3996': 089192
'3997': 089195
'3998': 089196
'3999': 089212
'4000': 089350
'4001': 089376
'4002': 089441
'4003': 089445
'4004': 089447
'4005': 089456
'4006': 089473
'4007': 089474
'4008': 089477
'4009': 089482
'4010': 089484
'4011': 089485
'4012': 089486
'4013': 089639
'4014': 089704
'4015': 089814
'4016': 089815
'4017': 089816
'4018': 089817
'4019': 089841
'4020': 089843
'4021': 089846
'4022': 089847
'4023': 089848
'4024': 089857
'4025': 089859
'4026': 089860
'4027': 089991
'4028': 089992
'4029': 090027
'4030': 090074
'4031': 090278
'4032': 090526
'4033': 090527
'4034': 090529
'4035': 090530
'4036': 090570
'4037': 090579
'4038': 090582
'4039': 090583
'4040': 090587
'4041': 090589
'4042': 090590
'4043': 090591
'4044': 090592
'4045': 090616
'4046': 090617
'4047': 090618
'4048': 090625
'4049': 090639
'4050': 090652
'4051': 090695
'4052': 090804
'4053': 090824
'4054': 090826
'4055': 090828
'4056': 090982
'4057': 090987
'4058': 090993
'4059': 091081
'4060': 091082
'4061': 091083
'4062': 091084
'4063': 091085
'4064': 091086
'4065': 091087
'4066': 091088
'4067': 091089
'4068': 091092
'4069': 091093
'4070': 091098
'4071': 091102
'4072': 091130
'4073': 091157
'4074': 091158
'4075': 091159
'4076': 091160
'4077': 091161
'4078': 091162
'4079': 091163
'4080': 091164
'4081': 091170
'4082': 091177
'4083': 091178
'4084': 091179
'4085': 091181
'4086': 091182
'4087': 091183
'4088': 091184
'4089': 091185
'4090': 091186
'4091': 091187
'4092': 091205
'4093': 091228
'4094': 091238
'4095': 091306
'4096': 091309
'4097': 091312
'4098': 091315
'4099': 091317
'4100': 091318
'4101': 091319
'4102': 091329
'4103': 091349
'4104': 091443
'4105': 091455
'4106': 091458
'4107': 091459
'4108': 091468
'4109': 091471
'4110': 091619
'4111': 091620
'4112': 091621
'4113': 091622
'4114': 091623
'4115': 091624
'4116': 091625
'4117': 091755
'4118': 091788
'4119': 091790
'4120': 091791
'4121': 091793
'4122': 091796
'4123': 091797
'4124': 091851
'4125': 091868
'4126': 091869
'4127': 091894
'4128': 091897
'4129': 091899
'4130': 091900
'4131': 091933
'4132': 091934
'4133': 091936
'4134': 091937
'4135': 091938
'4136': 091958
'4137': 091960
'4138': 092124
'4139': 092125
'4140': 092129
'4141': 092130
'4142': 092131
'4143': 092206
'4144': 092275
'4145': 092282
'4146': 092283
'4147': 092284
'4148': 092292
'4149': 092366
'4150': 092466
'4151': 092508
'4152': 092535
'4153': 092536
'4154': 092538
'4155': 092539
'4156': 092540
'4157': 092546
'4158': 092548
'4159': 092549
'4160': 092551
'4161': 092554
'4162': 092556
'4163': 092561
'4164': 092562
'4165': 092564
'4166': 092565
'4167': 092573
'4168': 092574
'4169': 092868
'4170': 092872
'4171': 092873
'4172': 092874
'4173': 092878
'4174': 092881
'4175': 092885
'4176': 092886
'4177': 092887
'4178': 092888
'4179': 092889
'4180': 092947
'4181': 092948
'4182': 092949
'4183': 092950
'4184': 092951
'4185': 092952
'4186': 092953
'4187': 092954
'4188': 092955
'4189': 093074
'4190': 093075
'4191': 093076
'4192': 093363
'4193': 093364
'4194': 093518
'4195': 093519
'4196': 093520
'4197': 093521
'4198': 093522
'4199': 093523
'4200': 093704
'4201': 093710
'4202': 093712
'4203': 093716
'4204': 093727
'4205': 093867
'4206': 093868
'4207': 093915
'4208': 093917
'4209': 093918
'4210': 093919
'4211': 093920
'4212': 093921
'4213': 093940
'4214': 093941
'4215': 093942
'4216': 093943
'4217': 093944
'4218': 093950
'4219': 093956
'4220': 093981
'4221': 093983
'4222': 093985
'4223': 093986
'4224': 094026
'4225': 094033
'4226': 094034
'4227': 094035
'4228': 094036
'4229': 094037
'4230': 094038
'4231': 094039
'4232': 094093
'4233': 094099
'4234': 094101
'4235': 094102
'4236': 094263
'4237': 094348
'4238': 094411
'4239': 094414
'4240': 094415
'4241': 094419
'4242': 094422
'4243': 094423
'4244': 094426
'4245': 094449
'4246': 094465
'4247': 094467
'4248': 094468
'4249': 094628
'4250': 094630
'4251': 094631
'4252': 094632
'4253': 094634
'4254': 094635
'4255': 094638
'4256': 094803
'4257': 095189
'4258': 095231
'4259': 095248
'4260': 095249
'4261': 095250
'4262': 095251
'4263': 095308
'4264': 095309
'4265': 095310
'4266': 095452
'4267': 095486
'4268': 095506
'4269': 095535
'4270': 095564
'4271': 095722
'4272': 095724
'4273': 095725
'4274': 095726
'4275': 095727
'4276': 095908
'4277': 095910
'4278': 095911
'4279': 095912
'4280': 095914
'4281': 095915
'4282': 096166
'4283': 096167
'4284': 096168
'4285': 096169
'4286': 096399
'4287': 096400
'4288': 096401
'4289': 096402
'4290': 096403
'4291': 096408
'4292': 096560
'4293': 096627
'4294': 096657
'4295': 096675
'4296': 096678
'4297': 096692
'4298': 096693
'4299': 096694
'4300': 096695
'4301': 096696
'4302': 096697
'4303': 096698
'4304': 096699
'4305': 096718
'4306': 096726
'4307': 096728
'4308': 096729
'4309': 096730
'4310': 096731
'4311': 096738
'4312': 096742
'4313': 096743
'4314': 096759
'4315': 096898
'4316': 096900
'4317': 096901
'4318': 096902
'4319': 096935
'4320': 096936
'4321': 096944
'4322': 096945
'4323': 096946
'4324': 097037
'4325': 097041
'4326': 097043
'4327': 097211
'4328': 097215
'4329': 097216
'4330': 097279
'4331': 097283
'4332': 097285
'4333': 097286
'4334': 097373
'4335': 097374
'4336': 097393
'4337': 097404
'4338': 097406
'4339': 097407
'4340': 097424
'4341': 097540
'4342': 097542
'4343': 097544
'4344': 097545
'4345': 097547
'4346': 097548
'4347': 097568
'4348': 097569
'4349': 097570
'4350': 097585
'4351': 097586
'4352': 097587
'4353': 097588
'4354': 097589
'4355': 097590
'4356': 097690
'4357': 097691
'4358': 097692
'4359': 097697
'4360': 097793
'4361': 097794
'4362': 097813
'4363': 097814
'4364': 097841
'4365': 097844
'4366': 097845
'4367': 097846
'4368': 097847
'4369': 097848
'4370': 097886
'4371': 097887
'4372': 097894
'4373': 097940
'4374': 097958
'4375': 097959
'4376': 097960
'4377': 097961
'4378': 097962
'4379': 097980
'4380': 097986
'4381': 097987
'4382': 097988
'4383': 097989
'4384': 098025
'4385': 098026
'4386': 098028
'4387': 098031
'4388': 098077
'4389': 098202
'4390': 098203
'4391': 098204
'4392': 098205
'4393': 098206
'4394': 098227
'4395': 098228
'4396': 098229
'4397': 098235
'4398': 098236
'4399': 098237
'4400': 098238
'4401': 098251
'4402': 098297
'4403': 098298
'4404': 098299
'4405': 098300
'4406': 098301
'4407': 098302
'4408': 098339
'4409': 098346
'4410': 098348
'4411': 098349
'4412': 098547
'4413': 098548
'4414': 098549
'4415': 098550
'4416': 098551
'4417': 098552
'4418': 098553
'4419': 098554
'4420': 098555
'4421': 098556
'4422': 098557
'4423': 098565
'4424': 098567
'4425': 098569
'4426': 098573
'4427': 098574
'4428': 098575
'4429': 098576
'4430': 098577
'4431': 098578
'4432': 098579
'4433': 098580
'4434': 098581
'4435': 098582
'4436': 098583
'4437': 098584
'4438': 098585
'4439': 098613
'4440': 098617
'4441': 098618
'4442': 098619
'4443': 098620
'4444': 098621
'4445': 098622
'4446': 098623
'4447': 098624
'4448': 098625
'4449': 098626
'4450': 098627
'4451': 098628
'4452': 098655
'4453': 098656
'4454': 098657
'4455': 098666
'4456': 098667
'4457': 098668
'4458': 098669
'4459': 098670
'4460': 098671
'4461': 098680
'4462': 098681
'4463': 098701
'4464': 098770
'4465': 098838
'4466': 099041
'4467': 099093
'4468': 099095
'4469': 099096
'4470': 099135
'4471': 099214
'4472': 099260
'4473': 099261
'4474': 099274
'4475': 099311
'4476': 099313
'4477': 099345
'4478': 099361
'4479': 099362
'4480': 099363
'4481': 099364
'4482': 099368
'4483': 099369
'4484': 099370
'4485': 099371
'4486': 099372
'4487': 099373
'4488': 099374
'4489': 099375
'4490': 099389
'4491': 099390
'4492': 099391
'4493': 099392
'4494': 099393
'4495': 099394
'4496': 099395
'4497': 099411
'4498': 099419
'4499': 099436
'4500': 099437
'4501': 099438
'4502': 099439
'4503': 099440
'4504': 099441
'4505': 099442
'4506': 099501
'4507': 099703
'4508': 099704
'4509': 099707
'4510': '100478'
'4511': '100479'
'4512': '100480'
'4513': '100497'
'4514': '100522'
'4515': '100535'
'4516': '100536'
'4517': '100544'
'4518': '100549'
'4519': '100550'
'4520': '100552'
'4521': '100745'
'4522': '100799'
'4523': '100802'
'4524': '100835'
'4525': '100949'
'4526': '100958'
'4527': '100959'
'4528': '100972'
'4529': '100973'
'4530': '100975'
'4531': '100976'
'4532': '101111'
'4533': '101112'
'4534': '101116'
'4535': '101118'
'4536': '101119'
'4537': '101864'
'4538': '101868'
'4539': '101873'
'4540': '101893'
'4541': '101951'
'4542': '102092'
'4543': '102112'
'4544': '102114'
'4545': '102195'
'4546': '103518'
'4547': '103519'
'4548': '103520'
'4549': '103521'
'4550': '103522'
'4551': '103523'
'4552': '103600'
'4553': '103800'
'4554': '103808'
'4555': '104008'
'4556': '104009'
'4557': '104010'
'4558': '104062'
'4559': '104063'
'4560': '104064'
'4561': '104065'
'4562': '104066'
'4563': '104067'
'4564': '104068'
'4565': '104086'
'4566': '104227'
'4567': '104276'
'4568': '104277'
'4569': '104278'
'4570': '104279'
'4571': '104282'
'4572': '104283'
'4573': '104284'
'4574': '104356'
'4575': '104357'
'4576': '104434'
'4577': '104625'
'4578': '104668'
'4579': '104724'
'4580': '104725'
'4581': '104779'
'4582': '104780'
'4583': '105022'
'4584': '105119'
'4585': '105141'
'4586': '105142'
'4587': '105144'
'4588': '105145'
'4589': '105196'
'4590': '105408'
'4591': '105411'
'4592': '105412'
'4593': '105413'
'4594': '105414'
'4595': '105443'
'4596': '105450'
'4597': '105451'
'4598': '105662'
'4599': '105664'
'4600': '105670'
'4601': '105671'
'4602': '105672'
'4603': '105673'
'4604': '105674'
'4605': '105682'
'4606': '105683'
'4607': '105685'
'4608': '105712'
'4609': '105713'
'4610': '105714'
'4611': '105715'
'4612': '105716'
'4613': '105717'
'4614': '105718'
'4615': '105719'
'4616': '105720'
'4617': '105722'
'4618': '105824'
'4619': '105825'
'4620': '105826'
'4621': '105827'
'4622': '105887'
'4623': '105890'
'4624': '105912'
'4625': '105914'
'4626': '105915'
'4627': '105916'
'4628': '105917'
'4629': '105918'
'4630': '105919'
'4631': '105920'
'4632': '106274'
'4633': '106277'
'4634': '106339'
'4635': '106342'
'4636': '106343'
'4637': '106456'
'4638': '106457'
'4639': '106458'
'4640': '106463'
'4641': '106465'
'4642': '106502'
'4643': '106522'
'4644': '106562'
'4645': '106563'
'4646': '106564'
'4647': '106566'
'4648': '106567'
'4649': '106568'
'4650': '106569'
'4651': '106570'
'4652': '106571'
'4653': '106629'
'4654': '106872'
'4655': '106876'
'4656': '106877'
'4657': '106937'
'4658': '106948'
'4659': '106951'
'4660': '106952'
'4661': '106953'
'4662': '106954'
'4663': '106955'
'4664': '106956'
'4665': '107020'
'4666': '107021'
'4667': '107025'
'4668': '107027'
'4669': '107028'
'4670': '107029'
'4671': '107030'
'4672': '107031'
'4673': '107046'
'4674': '107047'
'4675': '107048'
'4676': '107049'
'4677': '107050'
'4678': '107101'
'4679': '107125'
'4680': '107126'
'4681': '107127'
'4682': '107128'
'4683': '107129'
'4684': '107178'
'4685': '107179'
'4686': '107180'
'4687': '107181'
'4688': '107182'
'4689': '107183'
'4690': '107184'
'4691': '107185'
'4692': '107186'
'4693': '107187'
'4694': '107188'
'4695': '107189'
'4696': '107248'
'4697': '107249'
'4698': '107250'
'4699': '107251'
'4700': '107256'
'4701': '107257'
'4702': '107388'
'4703': '107389'
'4704': '107390'
'4705': '107391'
'4706': '107425'
'4707': '107426'
'4708': '107427'
'4709': '107429'
'4710': '107432'
'4711': '107433'
'4712': '107434'
'4713': '107435'
'4714': '107476'
'4715': '107506'
'4716': '107531'
'4717': '107532'
'4718': '107533'
'4719': '107534'
'4720': '107535'
'4721': '107567'
'4722': '107569'
'4723': '107571'
'4724': '107574'
'4725': '107577'
'4726': '107578'
'4727': '107579'
'4728': '107583'
'4729': '107584'
'4730': '107588'
'4731': '107589'
'4732': '107590'
'4733': '107591'
'4734': '107592'
'4735': '107593'
'4736': '107594'
'4737': '107595'
'4738': '107596'
'4739': '107597'
'4740': '107598'
'4741': '107613'
'4742': '107616'
'4743': '107617'
'4744': '107659'
'4745': '107799'
'4746': '107804'
'4747': '107805'
'4748': '107809'
'4749': '107810'
'4750': '107850'
'4751': '107851'
'4752': '107852'
'4753': '107908'
'4754': '107909'
'4755': '107910'
'4756': '107911'
'4757': '107912'
'4758': '107913'
'4759': '107949'
'4760': '107950'
'4761': '107951'
'4762': '107952'
'4763': '107953'
'4764': '107954'
'4765': '107955'
'4766': '107956'
'4767': '107957'
'4768': '108012'
'4769': '108014'
'4770': '108015'
'4771': '108016'
'4772': '108017'
'4773': '108018'
'4774': '108019'
'4775': '108020'
'4776': '108021'
'4777': '108022'
'4778': '108023'
'4779': '108024'
'4780': '108025'
'4781': '108026'
'4782': '108027'
'4783': '108031'
'4784': '108036'
'4785': '108037'
'4786': '108038'
'4787': '108049'
'4788': '108050'
'4789': '108059'
'4790': '108060'
'4791': '108079'
'4792': '108155'
'4793': '108230'
'4794': '108290'
'4795': '108297'
'4796': '108298'
'4797': '108299'
'4798': '108300'
'4799': '108301'
'4800': '108302'
'4801': '108303'
'4802': '108304'
'4803': '108305'
'4804': '108306'
'4805': '108307'
'4806': '108308'
'4807': '108313'
'4808': '108314'
'4809': '108318'
'4810': '108319'
'4811': '108339'
'4812': '108341'
'4813': '108342'
'4814': '108343'
'4815': '108415'
'4816': '108416'
'4817': '108418'
'4818': '108420'
'4819': '108421'
'4820': '108422'
'4821': '108423'
'4822': '108425'
'4823': '108426'
'4824': '108427'
'4825': '108428'
'4826': '108429'
'4827': '108456'
'4828': '108457'
'4829': '108459'
'4830': '108460'
'4831': '108461'
'4832': '108464'
'4833': '108471'
'4834': '108472'
'4835': '108473'
'4836': '108474'
'4837': '108475'
'4838': '108476'
'4839': '108477'
'4840': '108478'
'4841': '108487'
'4842': '108488'
'4843': '108489'
'4844': '108490'
'4845': '108491'
'4846': '108492'
'4847': '108493'
'4848': '108494'
'4849': '108495'
'4850': '108496'
'4851': '108497'
'4852': '108498'
'4853': '108499'
'4854': '108500'
'4855': '108501'
'4856': '108502'
'4857': '108503'
'4858': '108504'
'4859': '108505'
'4860': '108524'
'4861': '108525'
'4862': '108526'
'4863': '108527'
'4864': '108528'
'4865': '108529'
'4866': '108530'
'4867': '108531'
'4868': '108532'
'4869': '108533'
'4870': '108745'
'4871': '108774'
'4872': '108799'
'4873': '108808'
'4874': '108809'
'4875': '108812'
'4876': '108836'
'4877': '108837'
'4878': '108838'
'4879': '108839'
'4880': '108840'
'4881': '108841'
'4882': '108842'
'4883': '108843'
'4884': '108845'
'4885': '108846'
'4886': '108847'
'4887': '108863'
'4888': '108864'
'4889': '108865'
'4890': '108866'
'4891': '108867'
'4892': '108868'
'4893': '108878'
'4894': '108879'
'4895': '108880'
'4896': '108881'
'4897': '108882'
'4898': '108883'
'4899': '108884'
'4900': '108885'
'4901': '108906'
'4902': '108957'
'4903': '108961'
'4904': '108962'
'4905': '108967'
'4906': '108968'
'4907': '108969'
'4908': '108970'
'4909': '108992'
'4910': '109068'
'4911': '109071'
'4912': '109072'
'4913': '109106'
'4914': '109144'
'4915': '109189'
'4916': '109191'
'4917': '109203'
'4918': '109235'
'4919': '109276'
'4920': '109349'
'4921': '109350'
'4922': '109355'
'4923': '109356'
'4924': '109357'
'4925': '109445'
'4926': '109446'
'4927': '109447'
'4928': '109448'
'4929': '109449'
'4930': '109450'
'4931': '109468'
'4932': '109480'
'4933': '109481'
'4934': '109497'
'4935': '109535'
'4936': '109537'
'4937': '109538'
'4938': '109542'
'4939': '109543'
'4940': '109548'
'4941': '109670'
'4942': '109681'
'4943': '109684'
'4944': '109685'
'4945': '109686'
'4946': '109687'
'4947': '109711'
'4948': '109712'
'4949': '109896'
'4950': '109900'
'4951': '109901'
'4952': '109902'
'4953': '109903'
'4954': '109904'
'4955': '109905'
'4956': '109906'
'4957': '109925'
'4958': '109957'
'4959': '109958'
'4960': '109960'
'4961': '109962'
'4962': '109963'
'4963': '109971'
'4964': '109972'
'4965': '109973'
'4966': '109974'
'4967': '109975'
'4968': '109976'
'4969': '109977'
'4970': '109978'
'4971': '110070'
'4972': '110082'
'4973': '110084'
'4974': '110085'
'4975': '110086'
'4976': '110102'
'4977': '110103'
'4978': '110104'
'4979': '110105'
'4980': '110106'
'4981': '110107'
'4982': '110108'
'4983': '110109'
'4984': '110110'
'4985': '110111'
'4986': '110166'
'4987': '110167'
'4988': '110171'
'4989': '110172'
'4990': '110204'
'4991': '110205'
'4992': '110206'
'4993': '110207'
'4994': '110208'
'4995': '110209'
'4996': '110230'
'4997': '110259'
'4998': '110260'
'4999': '110261'
'5000': '110262'
'5001': '110263'
'5002': '110264'
'5003': '110265'
'5004': '110266'
'5005': '110267'
'5006': '110274'
'5007': '110384'
'5008': '110410'
'5009': '110417'
'5010': '110436'
'5011': '110437'
'5012': '110438'
'5013': '110439'
'5014': '110440'
'5015': '110441'
'5016': '110447'
'5017': '110448'
'5018': '110449'
'5019': '110450'
'5020': '110451'
'5021': '110452'
'5022': '110546'
'5023': '110610'
'5024': '110611'
'5025': '110623'
'5026': '110629'
'5027': '110630'
'5028': '110634'
'5029': '110636'
'5030': '110637'
'5031': '110647'
'5032': '110648'
'5033': '110649'
'5034': '110650'
'5035': '110651'
'5036': '110652'
'5037': '110653'
'5038': '110654'
'5039': '110681'
'5040': '110684'
'5041': '110687'
'5042': '110688'
'5043': '110689'
'5044': '110690'
'5045': '110691'
'5046': '110711'
'5047': '110735'
'5048': '110736'
'5049': '110743'
'5050': '110744'
'5051': '110756'
'5052': '110764'
'5053': '110765'
'5054': '110768'
'5055': '110771'
'5056': '110772'
'5057': '110774'
'5058': '110775'
'5059': '110776'
'5060': '110777'
'5061': '110778'
'5062': '110779'
'5063': '110923'
'5064': '110927'
'5065': '110928'
'5066': '110980'
'5067': '110982'
'5068': '110983'
'5069': '110985'
'5070': '111015'
'5071': '111146'
'5072': '111147'
'5073': '111148'
'5074': '111149'
'5075': '111150'
'5076': '111151'
'5077': '111153'
'5078': '111154'
'5079': '111182'
'5080': '111186'
'5081': '111187'
'5082': '111188'
'5083': '111216'
'5084': '111220'
'5085': '111221'
'5086': '111222'
'5087': '111223'
'5088': '111224'
'5089': '111225'
'5090': '111226'
'5091': '111227'
'5092': '111228'
'5093': '111229'
'5094': '111230'
'5095': '111306'
'5096': '111311'
'5097': '111335'
'5098': '111367'
'5099': '111368'
'5100': '111371'
'5101': '111372'
'5102': '111375'
'5103': '111376'
'5104': '111377'
'5105': '111378'
'5106': '111379'
'5107': '111382'
'5108': '111385'
'5109': '111386'
'5110': '111387'
'5111': '111388'
'5112': '111389'
'5113': '111390'
'5114': '111391'
'5115': '111392'
'5116': '111393'
'5117': '111394'
'5118': '111395'
'5119': '111396'
'5120': '111397'
'5121': '111398'
'5122': '111399'
'5123': '111400'
'5124': '111401'
'5125': '111402'
'5126': '111413'
'5127': '111416'
'5128': '111460'
'5129': '111579'
'5130': '111658'
'5131': '111747'
'5132': '111793'
'5133': '111819'
'5134': '111871'
'5135': '111872'
'5136': '111873'
'5137': '111911'
'5138': '111933'
'5139': '111934'
'5140': '111935'
'5141': '111936'
'5142': '111937'
'5143': '111938'
'5144': '111974'
'5145': '111982'
'5146': '111994'
'5147': '112000'
'5148': '112001'
'5149': '112020'
'5150': '112065'
'5151': '112066'
'5152': '112088'
'5153': '112133'
'5154': '112196'
'5155': '112197'
'5156': '112198'
'5157': '112199'
'5158': '112209'
'5159': '112210'
'5160': '112211'
'5161': '112215'
'5162': '112252'
'5163': '112314'
'5164': '112315'
'5165': '112316'
'5166': '112317'
'5167': '112318'
'5168': '112468'
'5169': '112481'
'5170': '112483'
'5171': '112484'
'5172': '112485'
'5173': '112486'
'5174': '112487'
'5175': '112488'
'5176': '112490'
'5177': '112526'
'5178': '112527'
'5179': '112528'
'5180': '112529'
'5181': '112583'
'5182': '112584'
'5183': '112585'
'5184': '112586'
'5185': '112587'
'5186': '112588'
'5187': '112668'
'5188': '112733'
'5189': '112734'
'5190': '112735'
'5191': '112767'
'5192': '112768'
'5193': '112769'
'5194': '112770'
'5195': '112780'
'5196': '112781'
'5197': '112785'
'5198': '112788'
'5199': '112789'
'5200': '112790'
'5201': '112821'
'5202': '112975'
'5203': '112976'
'5204': '112977'
'5205': '112978'
'5206': '113016'
'5207': '113017'
'5208': '113018'
'5209': '113019'
'5210': '113020'
'5211': '113021'
'5212': '113022'
'5213': '113023'
'5214': '113024'
'5215': '113025'
'5216': '113026'
'5217': '113027'
'5218': '113028'
'5219': '113030'
'5220': '113031'
'5221': '113032'
'5222': '113033'
'5223': '113034'
'5224': '113035'
'5225': '113036'
'5226': '113037'
'5227': '113063'
'5228': '113110'
'5229': '113164'
'5230': '113165'
'5231': '113166'
'5232': '113167'
'5233': '113203'
'5234': '113259'
'5235': '113260'
'5236': '113261'
'5237': '113262'
'5238': '113263'
'5239': '113264'
'5240': '113265'
'5241': '113266'
'5242': '113267'
'5243': '113268'
'5244': '113269'
'5245': '113270'
'5246': '113271'
'5247': '113272'
'5248': '113273'
'5249': '113274'
'5250': '113275'
'5251': '113276'
'5252': '113277'
'5253': '113278'
'5254': '113279'
'5255': '113280'
'5256': '113281'
'5257': '113282'
'5258': '113284'
'5259': '113294'
'5260': '113303'
'5261': '113304'
'5262': '113305'
'5263': '113311'
'5264': '113334'
'5265': '113335'
'5266': '113336'
'5267': '113342'
'5268': '113343'
'5269': '113344'
'5270': '113357'
'5271': '113359'
'5272': '113360'
'5273': '113453'
'5274': '113511'
'5275': '113512'
'5276': '113513'
'5277': '113530'
'5278': '113558'
'5279': '113564'
'5280': '113574'
'5281': '113696'
'5282': '113697'
'5283': '113698'
'5284': '113699'
'5285': '113700'
'5286': '113701'
'5287': '113702'
'5288': '113787'
'5289': '113788'
'5290': '113789'
'5291': '113790'
'5292': '113808'
'5293': '113809'
'5294': '113810'
'5295': '113822'
'5296': '113932'
'5297': '113933'
'5298': '113934'
'5299': '113935'
'5300': '113946'
'5301': '113949'
'5302': '113950'
'5303': '113969'
'5304': '113970'
'5305': '113971'
'5306': '113972'
'5307': '113973'
'5308': '114006'
'5309': '114007'
'5310': '114036'
'5311': '114037'
'5312': '114040'
'5313': '114041'
'5314': '114042'
'5315': '114044'
'5316': '114045'
'5317': '114047'
'5318': '114048'
'5319': '114049'
'5320': '114050'
'5321': '114051'
'5322': '114061'
'5323': '114062'
'5324': '114063'
'5325': '114064'
'5326': '114065'
'5327': '114066'
'5328': '114067'
'5329': '114069'
'5330': '114070'
'5331': '114072'
'5332': '114073'
'5333': '114074'
'5334': '114076'
'5335': '114077'
'5336': '114198'
'5337': '114199'
'5338': '114200'
'5339': '114201'
'5340': '114212'
'5341': '114222'
'5342': '114223'
'5343': '114231'
'5344': '114232'
'5345': '114233'
'5346': '114234'
'5347': '114235'
'5348': '114236'
'5349': '114237'
'5350': '114238'
'5351': '114239'
'5352': '114242'
'5353': '114245'
'5354': '114265'
'5355': '114266'
'5356': '114268'
'5357': '114272'
'5358': '114274'
'5359': '114275'
'5360': '114279'
'5361': '114282'
'5362': '114283'
'5363': '114289'
'5364': '114290'
'5365': '114291'
'5366': '114292'
'5367': '114293'
'5368': '114294'
'5369': '114295'
'5370': '114296'
'5371': '114297'
'5372': '114298'
'5373': '114371'
'5374': '114372'
'5375': '114373'
'5376': '114374'
'5377': '114375'
'5378': '114384'
'5379': '114385'
'5380': '114386'
'5381': '114387'
'5382': '114388'
'5383': '114389'
'5384': '114390'
'5385': '114391'
'5386': '114392'
'5387': '114393'
'5388': '114395'
'5389': '114396'
'5390': '114397'
'5391': '114398'
'5392': '114399'
'5393': '114400'
'5394': '114401'
'5395': '114402'
'5396': '114403'
'5397': '114404'
'5398': '114405'
'5399': '114406'
'5400': '114408'
'5401': '114409'
'5402': '114410'
'5403': '114411'
'5404': '114412'
'5405': '114413'
'5406': '114414'
'5407': '114415'
'5408': '114416'
'5409': '114430'
'5410': '114532'
'5411': '114533'
'5412': '114534'
'5413': '114535'
'5414': '114536'
'5415': '114538'
'5416': '114539'
'5417': '114541'
'5418': '114544'
'5419': '114545'
'5420': '114556'
'5421': '114558'
'5422': '114559'
'5423': '114879'
'5424': '114880'
'5425': '114884'
'5426': '114936'
'5427': '114937'
'5428': '114938'
'5429': '114939'
'5430': '114940'
'5431': '114941'
'5432': '114942'
'5433': '114943'
'5434': '114974'
'5435': '114976'
'5436': '115002'
'5437': '115011'
'5438': '115125'
'5439': '115176'
'5440': '115262'
'5441': '115263'
'5442': '115267'
'5443': '115268'
'5444': '115269'
'5445': '115271'
'5446': '115272'
'5447': '115273'
'5448': '115288'
'5449': '115289'
'5450': '115290'
'5451': '115292'
'5452': '115293'
'5453': '115294'
'5454': '115321'
'5455': '115339'
'5456': '115391'
'5457': '115392'
'5458': '115470'
'5459': '115471'
'5460': '115472'
'5461': '115473'
'5462': '115474'
'5463': '115475'
'5464': '115591'
'5465': '115592'
'5466': '115597'
'5467': '115697'
'5468': '115698'
'5469': '115699'
'5470': '115700'
'5471': '115721'
'5472': '115722'
'5473': '115723'
'5474': '115724'
'5475': '115735'
'5476': '115761'
'5477': '115762'
'5478': '115764'
'5479': '115765'
'5480': '115766'
'5481': '115767'
'5482': '115768'
'5483': '115769'
'5484': '115771'
'5485': '115772'
'5486': '115773'
'5487': '115774'
'5488': '115775'
'5489': '115811'
'5490': '115812'
'5491': '115813'
'5492': '115814'
'5493': '115815'
'5494': '115816'
'5495': '115817'
'5496': '115849'
'5497': '115850'
'5498': '115852'
'5499': '115888'
'5500': '115891'
'5501': '115892'
'5502': '115922'
'5503': '115923'
'5504': '115925'
'5505': '115926'
'5506': '115927'
'5507': '115930'
'5508': '115932'
'5509': '115935'
'5510': '115944'
'5511': '115948'
'5512': '116029'
'5513': '116068'
'5514': '116098'
'5515': '116099'
'5516': '116101'
'5517': '116116'
'5518': '116119'
'5519': '116175'
'5520': '116176'
'5521': '116177'
'5522': '116235'
'5523': '116236'
'5524': '116237'
'5525': '116238'
'5526': '116239'
'5527': '116240'
'5528': '116241'
'5529': '116242'
'5530': '116243'
'5531': '116261'
'5532': '116344'
'5533': '116345'
'5534': '116372'
'5535': '116383'
'5536': '116388'
'5537': '116389'
'5538': '116390'
'5539': '116407'
'5540': '116446'
'5541': '116447'
'5542': '116448'
'5543': '116449'
'5544': '116451'
'5545': '116452'
'5546': '116453'
'5547': '116454'
'5548': '116455'
'5549': '116456'
'5550': '116457'
'5551': '116458'
'5552': '116464'
'5553': '116465'
'5554': '116466'
'5555': '116467'
'5556': '116468'
'5557': '116487'
'5558': '116488'
'5559': '116489'
'5560': '116490'
'5561': '116491'
'5562': '116514'
'5563': '116517'
'5564': '116525'
'5565': '116526'
'5566': '116527'
'5567': '116528'
'5568': '116547'
'5569': '116549'
'5570': '116586'
'5571': '116587'
'5572': '116704'
'5573': '116706'
'5574': '116707'
'5575': '116709'
'5576': '116733'
'5577': '116735'
'5578': '116736'
'5579': '116753'
'5580': '116755'
'5581': '116756'
'5582': '116757'
'5583': '116758'
'5584': '116759'
'5585': '116760'
'5586': '116833'
'5587': '116868'
'5588': '116869'
'5589': '116870'
'5590': '116871'
'5591': '116872'
'5592': '116873'
'5593': '116874'
'5594': '116876'
'5595': '116877'
'5596': '116878'
'5597': '116879'
'5598': '116880'
'5599': '116881'
'5600': '116882'
'5601': '116883'
'5602': '117057'
'5603': '117159'
'5604': '117160'
'5605': '117161'
'5606': '117169'
'5607': '117170'
'5608': '117171'
'5609': '117172'
'5610': '117173'
'5611': '117251'
'5612': '117252'
'5613': '117253'
'5614': '117287'
'5615': '117288'
'5616': '117450'
'5617': '117472'
'5618': '117473'
'5619': '117609'
'5620': '117610'
'5621': '117611'
'5622': '117612'
'5623': '117613'
'5624': '117614'
'5625': '117626'
'5626': '117627'
'5627': '117628'
'5628': '117629'
'5629': '117630'
'5630': '117631'
'5631': '117632'
'5632': '117666'
'5633': '117667'
'5634': '117668'
'5635': '117669'
'5636': '117670'
'5637': '117846'
'5638': '117883'
'5639': '117884'
'5640': '117885'
'5641': '117886'
'5642': '117887'
'5643': '117942'
'5644': '117943'
'5645': '117944'
'5646': '117945'
'5647': '117946'
'5648': '117961'
'5649': '117966'
'5650': '117967'
'5651': '117970'
'5652': '117991'
'5653': '118000'
'5654': '118012'
'5655': '118058'
'5656': '118059'
'5657': '118060'
'5658': '118061'
'5659': '118062'
'5660': '118063'
'5661': '118068'
'5662': '118070'
'5663': '118084'
'5664': '118085'
'5665': '118087'
'5666': '118195'
'5667': '118196'
'5668': '118222'
'5669': '118223'
'5670': '118257'
'5671': '118276'
'5672': '118277'
'5673': '118279'
'5674': '118327'
'5675': '118384'
'5676': '118478'
'5677': '118484'
'5678': '118489'
'5679': '118496'
'5680': '118498'
'5681': '118499'
'5682': '118500'
'5683': '118502'
'5684': '118503'
'5685': '118504'
'5686': '118505'
'5687': '118507'
'5688': '118569'
'5689': '118618'
'5690': '118629'
'5691': '118670'
'5692': '118671'
'5693': '118672'
'5694': '118674'
'5695': '118734'
'5696': '118735'
'5697': '118738'
'5698': '118739'
'5699': '118886'
'5700': '118891'
'5701': '118920'
'5702': '118921'
'5703': '118922'
'5704': '118923'
'5705': '118950'
'5706': '118951'
'5707': '118952'
'5708': '118953'
'5709': '118954'
'5710': '118955'
'5711': '118957'
'5712': '118958'
'5713': '118972'
'5714': '118986'
'5715': '118987'
'5716': '118988'
'5717': '119025'
'5718': '119026'
'5719': '119027'
'5720': '119063'
'5721': '119086'
'5722': '119095'
'5723': '119097'
'5724': '119118'
'5725': '119134'
'5726': '119187'
'5727': '119193'
'5728': '119257'
'5729': '119369'
'5730': '119379'
'5731': '119413'
'5732': '119545'
'5733': '119569'
'5734': '119571'
'5735': '119574'
'5736': '119575'
'5737': '119578'
'5738': '119579'
'5739': '119580'
'5740': '119582'
'5741': '119583'
'5742': '119584'
'5743': '119592'
'5744': '119715'
'5745': '119719'
'5746': '119725'
'5747': '119726'
'5748': '119727'
'5749': '119745'
'5750': '119828'
'5751': '119830'
'5752': '119831'
'5753': '119893'
'5754': '119894'
'5755': '119895'
'5756': '119896'
'5757': '119897'
'5758': '119898'
'5759': '119899'
'5760': '119900'
'5761': '119901'
'5762': '119922'
'5763': '119938'
'5764': '119939'
'5765': '119940'
'5766': '119941'
'5767': '119942'
'5768': '119979'
'5769': '119985'
'5770': '119988'
'5771': '119991'
'5772': '119992'
'5773': '119993'
'5774': '119994'
'5775': '120099'
'5776': '120105'
'5777': '120109'
'5778': '120111'
'5779': '120112'
'5780': '120150'
'5781': '120160'
'5782': '120161'
'5783': '120171'
'5784': '120172'
'5785': '120177'
'5786': '120178'
'5787': '120179'
'5788': '120183'
'5789': '120184'
'5790': '120188'
'5791': '120189'
'5792': '120194'
'5793': '120196'
'5794': '120199'
'5795': '120200'
'5796': '120201'
'5797': '120203'
'5798': '120206'
'5799': '120207'
'5800': '120208'
'5801': '120296'
'5802': '120297'
'5803': '120298'
'5804': '120299'
'5805': '120300'
'5806': '120302'
'5807': '120303'
'5808': '120304'
'5809': '120305'
'5810': '120306'
'5811': '120307'
'5812': '120308'
'5813': '120309'
'5814': '120310'
'5815': '120312'
'5816': '120313'
'5817': '120314'
'5818': '120315'
'5819': '120316'
'5820': '120317'
'5821': '120318'
'5822': '120319'
'5823': '120320'
'5824': '120321'
'5825': '120322'
'5826': '120323'
'5827': '120324'
'5828': '120325'
'5829': '120326'
'5830': '120327'
'5831': '120328'
'5832': '120329'
'5833': '120330'
'5834': '120331'
'5835': '120332'
'5836': '120333'
'5837': '120462'
'5838': '120466'
'5839': '120467'
'5840': '120468'
'5841': '120469'
'5842': '120470'
'5843': '120471'
'5844': '120504'
'5845': '120513'
'5846': '120514'
'5847': '120515'
'5848': '120518'
'5849': '120769'
'5850': '120770'
'5851': '120771'
'5852': '120772'
'5853': '120773'
'5854': '120774'
'5855': '120775'
'5856': '120776'
'5857': '120777'
'5858': '120778'
'5859': '120779'
'5860': '120782'
'5861': '121251'
'5862': '121256'
'5863': '121257'
'5864': '121273'
'5865': '121288'
'5866': '121312'
'5867': '121313'
'5868': '121314'
'5869': '121315'
'5870': '121316'
'5871': '121317'
'5872': '121318'
'5873': '121319'
'5874': '121320'
'5875': '121321'
'5876': '121322'
'5877': '121323'
'5878': '121346'
'5879': '121366'
'5880': '121415'
'5881': '121449'
'5882': '121450'
'5883': '121451'
'5884': '121452'
'5885': '121453'
'5886': '121454'
'5887': '121472'
'5888': '121473'
'5889': '121474'
'5890': '121475'
'5891': '121570'
'5892': '121589'
'5893': '121590'
'5894': '121591'
'5895': '121592'
'5896': '121593'
'5897': '121594'
'5898': '121595'
'5899': '121651'
'5900': '121652'
'5901': '121653'
'5902': '121654'
'5903': '121655'
'5904': '121656'
'5905': '121657'
'5906': '121658'
'5907': '121659'
'5908': '121660'
'5909': '121661'
'5910': '121662'
'5911': '121663'
'5912': '121664'
'5913': '121665'
'5914': '121666'
'5915': '121734'
'5916': '121735'
'5917': '121736'
'5918': '121737'
'5919': '121738'
'5920': '121739'
'5921': '121740'
'5922': '121813'
'5923': '121866'
'5924': '121867'
'5925': '121869'
'5926': '121913'
'5927': '121915'
'5928': '121922'
'5929': '121926'
'5930': '121929'
'5931': '121930'
'5932': '121976'
'5933': '121985'
'5934': '121987'
'5935': '121998'
'5936': '122001'
'5937': '122003'
'5938': '122004'
'5939': '122066'
'5940': '122077'
'5941': '122079'
'5942': '122080'
'5943': '122081'
'5944': '122082'
'5945': '122083'
'5946': '122084'
'5947': '122085'
'5948': '122086'
'5949': '122087'
'5950': '122088'
'5951': '122106'
'5952': '122107'
'5953': '122132'
'5954': '122143'
'5955': '122153'
'5956': '122155'
'5957': '122166'
'5958': '122168'
'5959': '122190'
'5960': '122199'
'5961': '122201'
'5962': '122204'
'5963': '122247'
'5964': '122261'
'5965': '122352'
'5966': '122353'
'5967': '122354'
'5968': '122355'
'5969': '122356'
'5970': '122357'
'5971': '122358'
'5972': '122359'
'5973': '122360'
'5974': '122362'
'5975': '122363'
'5976': '122364'
'5977': '122365'
'5978': '122395'
'5979': '122397'
'5980': '122398'
'5981': '122399'
'5982': '122400'
'5983': '122456'
'5984': '122457'
'5985': '122472'
'5986': '122473'
'5987': '122474'
'5988': '122475'
'5989': '122498'
'5990': '122499'
'5991': '122500'
'5992': '122503'
'5993': '122504'
'5994': '122510'
'5995': '122511'
'5996': '122533'
'5997': '122534'
'5998': '122578'
'5999': '122579'
'6000': '122620'
'6001': '122621'
'6002': '122622'
'6003': '122623'
'6004': '122624'
'6005': '122625'
'6006': '122626'
'6007': '122627'
'6008': '122628'
'6009': '122630'
'6010': '122631'
'6011': '122632'
'6012': '122633'
'6013': '122634'
'6014': '122635'
'6015': '122644'
'6016': '122645'
'6017': '122646'
'6018': '122647'
'6019': '122648'
'6020': '122649'
'6021': '122650'
'6022': '122651'
'6023': '122654'
'6024': '122671'
'6025': '122673'
'6026': '122675'
'6027': '122683'
'6028': '122685'
'6029': '122686'
'6030': '122798'
'6031': '122799'
'6032': '122800'
'6033': '122803'
'6034': '122804'
'6035': '122805'
'6036': '122806'
'6037': '122807'
'6038': '122808'
'6039': '122809'
'6040': '122810'
'6041': '122832'
'6042': '122901'
'6043': '122910'
'6044': '122911'
'6045': '122932'
'6046': '122934'
'6047': '122935'
'6048': '122936'
'6049': '122959'
'6050': '122999'
'6051': '123000'
'6052': '123001'
'6053': '123002'
'6054': '123003'
'6055': '123004'
'6056': '123094'
'6057': '123096'
'6058': '123097'
'6059': '123099'
'6060': '123147'
'6061': '123273'
'6062': '123278'
'6063': '123333'
'6064': '123342'
'6065': '123427'
'6066': '123438'
'6067': '123439'
'6068': '123440'
'6069': '123441'
'6070': '123442'
'6071': '123458'
'6072': '123461'
'6073': '123467'
'6074': '123468'
'6075': '123474'
'6076': '123484'
'6077': '123485'
'6078': '123486'
'6079': '123487'
'6080': '123488'
'6081': '123490'
'6082': '123494'
'6083': '123501'
'6084': '123502'
'6085': '123503'
'6086': '123504'
'6087': '123505'
'6088': '123506'
'6089': '123509'
'6090': '123523'
'6091': '123614'
'6092': '123641'
'6093': '123645'
'6094': '123647'
'6095': '123760'
'6096': '123761'
'6097': '123762'
'6098': '123763'
'6099': '123764'
'6100': '123821'
'6101': '123825'
'6102': '123832'
'6103': '123834'
'6104': '123835'
'6105': '123866'
'6106': '123867'
'6107': '123868'
'6108': '123899'
'6109': '123932'
'6110': '123933'
'6111': '123934'
'6112': '123935'
'6113': '123936'
'6114': '123937'
'6115': '123938'
'6116': '123964'
'6117': '123965'
'6118': '123966'
'6119': '123968'
'6120': '123969'
'6121': '123970'
'6122': '123971'
'6123': '123972'
'6124': '123973'
'6125': '123974'
'6126': '123975'
'6127': '123976'
'6128': '123977'
'6129': '123978'
'6130': '123979'
'6131': '123980'
'6132': '123981'
'6133': '123986'
'6134': '124154'
'6135': '124175'
'6136': '124176'
'6137': '124177'
'6138': '124178'
'6139': '124179'
'6140': '124180'
'6141': '124181'
'6142': '124183'
'6143': '124184'
'6144': '124185'
'6145': '124186'
'6146': '124201'
'6147': '124231'
'6148': '124391'
'6149': '124392'
'6150': '124393'
'6151': '124394'
'6152': '124409'
'6153': '124411'
'6154': '124424'
'6155': '124425'
'6156': '124426'
'6157': '124460'
'6158': '124461'
'6159': '124470'
'6160': '124474'
'6161': '124477'
'6162': '124479'
'6163': '124480'
'6164': '124481'
'6165': '124482'
'6166': '124483'
'6167': '124484'
'6168': '124485'
'6169': '124509'
'6170': '124517'
'6171': '124518'
'6172': '124519'
'6173': '124554'
'6174': '124555'
'6175': '124702'
'6176': '124752'
'6177': '124753'
'6178': '124754'
'6179': '124755'
'6180': '124756'
'6181': '124870'
'6182': '124872'
'6183': '124873'
'6184': '124874'
'6185': '124875'
'6186': '124876'
'6187': '124877'
'6188': '124891'
'6189': '124892'
'6190': '124912'
'6191': '124913'
'6192': '124915'
'6193': '124916'
'6194': '124917'
'6195': '124918'
'6196': '124971'
'6197': '124992'
'6198': '124996'
'6199': '125001'
'6200': '125002'
'6201': '125003'
'6202': '125004'
'6203': '125154'
'6204': '125156'
'6205': '125157'
'6206': '125158'
'6207': '125159'
'6208': '125160'
'6209': '125161'
'6210': '125182'
'6211': '125183'
'6212': '125185'
'6213': '125186'
'6214': '125187'
'6215': '125188'
'6216': '125189'
'6217': '125190'
'6218': '125191'
'6219': '125192'
'6220': '125193'
'6221': '125194'
'6222': '125195'
'6223': '125196'
'6224': '125237'
'6225': '125238'
'6226': '125239'
'6227': '125240'
'6228': '125286'
'6229': '125287'
'6230': '125288'
'6231': '125289'
'6232': '125291'
'6233': '125293'
'6234': '125298'
'6235': '125299'
'6236': '125312'
'6237': '125313'
'6238': '125314'
'6239': '125315'
'6240': '125333'
'6241': '125337'
'6242': '125375'
'6243': '125377'
'6244': '125432'
'6245': '125551'
'6246': '125612'
'6247': '125614'
'6248': '125616'
'6249': '125617'
'6250': '125618'
'6251': '125620'
'6252': '125621'
'6253': '125622'
'6254': '125657'
'6255': '125659'
'6256': '125680'
'6257': '125681'
'6258': '125721'
'6259': '125722'
'6260': '125723'
'6261': '125774'
'6262': '125776'
'6263': '125777'
'6264': '125778'
'6265': '125779'
'6266': '125809'
'6267': '125812'
'6268': '125813'
'6269': '125814'
'6270': '125815'
'6271': '125816'
'6272': '125817'
'6273': '125818'
'6274': '125819'
'6275': '125820'
'6276': '125821'
'6277': '125822'
'6278': '125823'
'6279': '125824'
'6280': '125825'
'6281': '125826'
'6282': '125827'
'6283': '125999'
'6284': '126014'
'6285': '126015'
'6286': '126016'
'6287': '126017'
'6288': '126018'
'6289': '126047'
'6290': '126055'
'6291': '126102'
'6292': '126103'
'6293': '126104'
'6294': '126105'
'6295': '126180'
'6296': '126181'
'6297': '126182'
'6298': '126183'
'6299': '126185'
'6300': '126186'
'6301': '126187'
'6302': '126188'
'6303': '126189'
'6304': '126214'
'6305': '126215'
'6306': '126216'
'6307': '126217'
'6308': '126218'
'6309': '126219'
'6310': '126220'
'6311': '126221'
'6312': '126223'
'6313': '126224'
'6314': '126225'
'6315': '126226'
'6316': '126227'
'6317': '126229'
'6318': '126230'
'6319': '126231'
'6320': '126232'
'6321': '126233'
'6322': '126234'
'6323': '126240'
'6324': '126241'
'6325': '126242'
'6326': '126243'
'6327': '126276'
'6328': '126283'
'6329': '126289'
'6330': '126290'
'6331': '126291'
'6332': '126292'
'6333': '126294'
'6334': '126295'
'6335': '126297'
'6336': '126300'
'6337': '126316'
'6338': '126317'
'6339': '126318'
'6340': '126319'
'6341': '126320'
'6342': '126321'
'6343': '126354'
'6344': '126357'
'6345': '126362'
'6346': '126398'
'6347': '126400'
'6348': '126401'
'6349': '126402'
'6350': '126403'
'6351': '126404'
'6352': '126405'
'6353': '126406'
'6354': '126407'
'6355': '126408'
'6356': '126409'
'6357': '126410'
'6358': '126411'
'6359': '126412'
'6360': '126413'
'6361': '126414'
'6362': '126415'
'6363': '126416'
'6364': '126417'
'6365': '126425'
'6366': '126426'
'6367': '126427'
'6368': '126428'
'6369': '126429'
'6370': '126430'
'6371': '126431'
'6372': '126455'
'6373': '126489'
'6374': '126490'
'6375': '126491'
'6376': '126505'
'6377': '126506'
'6378': '126507'
'6379': '126508'
'6380': '126510'
'6381': '126512'
'6382': '126516'
'6383': '126519'
'6384': '126520'
'6385': '126521'
'6386': '126522'
'6387': '126550'
'6388': '126557'
'6389': '126559'
'6390': '126584'
'6391': '126585'
'6392': '126586'
'6393': '126587'
'6394': '126588'
'6395': '126589'
'6396': '126598'
'6397': '126600'
'6398': '126601'
'6399': '126602'
'6400': '126603'
'6401': '126605'
'6402': '126606'
'6403': '126607'
'6404': '126608'
'6405': '126646'
'6406': '126666'
'6407': '126667'
'6408': '126668'
'6409': '126669'
'6410': '126670'
'6411': '126671'
'6412': '126672'
'6413': '126673'
'6414': '126674'
'6415': '126675'
'6416': '126676'
'6417': '126716'
'6418': '126717'
'6419': '126718'
'6420': '126719'
'6421': '126720'
'6422': '126743'
'6423': '126746'
'6424': '126747'
'6425': '126748'
'6426': '126749'
'6427': '126773'
'6428': '126778'
'6429': '126781'
'6430': '126782'
'6431': '126786'
'6432': '126789'
'6433': '126790'
'6434': '126882'
'6435': '126883'
'6436': '126884'
'6437': '126885'
'6438': '126886'
'6439': '126887'
'6440': '126899'
'6441': '126900'
'6442': '126944'
'6443': '126979'
'6444': '127036'
'6445': '127037'
'6446': '127062'
'6447': '127066'
'6448': '127155'
'6449': '127159'
'6450': '127180'
'6451': '127181'
'6452': '127182'
'6453': '127183'
'6454': '127184'
'6455': '127185'
'6456': '127186'
'6457': '127187'
'6458': '127188'
'6459': '127189'
'6460': '127190'
'6461': '127191'
'6462': '127192'
'6463': '127193'
'6464': '127194'
'6465': '127203'
'6466': '127204'
'6467': '127205'
'6468': '127206'
'6469': '127207'
'6470': '127208'
'6471': '127209'
'6472': '127210'
'6473': '127211'
'6474': '127212'
'6475': '127263'
'6476': '127265'
'6477': '127266'
'6478': '127267'
'6479': '127268'
'6480': '127269'
'6481': '127271'
'6482': '127273'
'6483': '127274'
'6484': '127275'
'6485': '127276'
'6486': '127277'
'6487': '127278'
'6488': '127279'
'6489': '127280'
'6490': '127281'
'6491': '127285'
'6492': '127286'
'6493': '127287'
'6494': '127288'
'6495': '127289'
'6496': '127290'
'6497': '127294'
'6498': '127295'
'6499': '127296'
'6500': '127297'
'6501': '127298'
'6502': '127299'
'6503': '127300'
'6504': '127301'
'6505': '127302'
'6506': '127303'
'6507': '127330'
'6508': '127331'
'6509': '127339'
'6510': '127343'
'6511': '127349'
'6512': '127350'
'6513': '127356'
'6514': '127357'
'6515': '127358'
'6516': '127359'
'6517': '127360'
'6518': '127402'
'6519': '127422'
'6520': '127469'
'6521': '127484'
'6522': '127494'
'6523': '127495'
'6524': '127496'
'6525': '127497'
'6526': '127498'
'6527': '127499'
'6528': '127519'
'6529': '127520'
'6530': '127532'
'6531': '127541'
'6532': '127542'
'6533': '127559'
'6534': '127620'
'6535': '127623'
'6536': '127648'
'6537': '127660'
'6538': '127661'
'6539': '127662'
'6540': '127663'
'6541': '127720'
'6542': '127722'
'6543': '127726'
'6544': '127798'
'6545': '127804'
'6546': '127806'
'6547': '127865'
'6548': '127866'
'6549': '127867'
'6550': '127868'
'6551': '127869'
'6552': '127870'
'6553': '127871'
'6554': '127878'
'6555': '127908'
'6556': '127909'
'6557': '127910'
'6558': '127911'
'6559': '127912'
'6560': '127913'
'6561': '127914'
'6562': '127915'
'6563': '127916'
'6564': '127936'
'6565': '127996'
'6566': '128441'
'6567': '128443'
'6568': '128448'
'6569': '128469'
'6570': '128470'
'6571': '128471'
'6572': '128472'
'6573': '128473'
'6574': '128476'
'6575': '128477'
'6576': '128482'
'6577': '128484'
'6578': '128494'
'6579': '128500'
'6580': '128504'
'6581': '128619'
'6582': '128666'
'6583': '128668'
'6584': '128699'
'6585': '128709'
'6586': '128710'
'6587': '128711'
'6588': '128758'
'6589': '128759'
'6590': '128760'
'6591': '128799'
'6592': '128811'
'6593': '128812'
'6594': '128813'
'6595': '128814'
'6596': '128815'
'6597': '128816'
'6598': '128825'
'6599': '128827'
'6600': '128828'
'6601': '128835'
'6602': '128845'
'6603': '128878'
'6604': '128879'
'6605': '128880'
'6606': '128881'
'6607': '128882'
'6608': '128885'
'6609': '128886'
'6610': '128887'
'6611': '128888'
'6612': '128927'
'6613': '128992'
'6614': '129039'
'6615': '129040'
'6616': '129042'
'6617': '129043'
'6618': '129044'
'6619': '129046'
'6620': '129048'
'6621': '129049'
'6622': '129051'
'6623': '129052'
'6624': '129053'
'6625': '129054'
'6626': '129055'
'6627': '129056'
'6628': '129088'
'6629': '129089'
'6630': '129090'
'6631': '129091'
'6632': '129092'
'6633': '129093'
'6634': '129094'
'6635': '129095'
'6636': '129096'
'6637': '129097'
'6638': '129098'
'6639': '129184'
'6640': '129185'
'6641': '129186'
'6642': '129187'
'6643': '129188'
'6644': '129189'
'6645': '129190'
'6646': '129268'
'6647': '129362'
'6648': '129372'
'6649': '129374'
'6650': '129375'
'6651': '129391'
'6652': '129392'
'6653': '129393'
'6654': '129395'
'6655': '129396'
'6656': '129397'
'6657': '129398'
'6658': '129399'
'6659': '129400'
'6660': '129401'
'6661': '129402'
'6662': '129403'
'6663': '129404'
'6664': '129405'
'6665': '129406'
'6666': '129407'
'6667': '129439'
'6668': '129442'
'6669': '129444'
'6670': '129620'
'6671': '129622'
'6672': '129624'
'6673': '129674'
'6674': '129675'
'6675': '129683'
'6676': '129694'
'6677': '129695'
'6678': '129696'
'6679': '129742'
'6680': '129806'
'6681': '129807'
'6682': '129808'
'6683': '129816'
'6684': '129874'
'6685': '129875'
'6686': '129876'
'6687': '129879'
'6688': '129880'
'6689': '129882'
'6690': '129883'
'6691': '129884'
'6692': '129885'
'6693': '129886'
'6694': '129887'
'6695': '129889'
'6696': '129904'
'6697': '129910'
'6698': '129914'
'6699': '129915'
'6700': '129918'
'6701': '129919'
'6702': '129920'
'6703': '129922'
'6704': '129923'
'6705': '129924'
'6706': '129925'
'6707': '129926'
'6708': '129927'
'6709': '129962'
'6710': '129968'
'6711': '129969'
'6712': '129970'
'6713': '129972'
'6714': '129973'
'6715': '129997'
'6716': '130016'
'6717': '130084'
'6718': '130129'
'6719': '130130'
'6720': '130131'
'6721': '130132'
'6722': '130133'
'6723': '130134'
'6724': '130135'
'6725': '130136'
'6726': '130137'
'6727': '130168'
'6728': '130170'
'6729': '130218'
'6730': '130265'
'6731': '130347'
'6732': '130349'
'6733': '130367'
'6734': '130368'
'6735': '130369'
'6736': '130370'
'6737': '130371'
'6738': '130372'
'6739': '130440'
'6740': '130454'
'6741': '130456'
'6742': '130650'
'6743': '130667'
'6744': '130682'
'6745': '130683'
'6746': '130689'
'6747': '130691'
'6748': '130692'
'6749': '130693'
'6750': '130702'
'6751': '130709'
'6752': '130710'
'6753': '130711'
'6754': '130752'
'6755': '130758'
'6756': '130920'
'6757': '130921'
'6758': '130922'
'6759': '130923'
'6760': '130927'
'6761': '130929'
'6762': '130930'
'6763': '130931'
'6764': '130932'
'6765': '130933'
'6766': '130934'
'6767': '130937'
'6768': '130940'
'6769': '130944'
'6770': '130945'
'6771': '130948'
'6772': '130950'
'6773': '130951'
'6774': '130952'
'6775': '130953'
'6776': '130954'
'6777': '130955'
'6778': '130956'
'6779': '130963'
'6780': '130964'
'6781': '130986'
'6782': '130988'
'6783': '130989'
'6784': '130990'
'6785': '130991'
'6786': '130992'
'6787': '130993'
'6788': '131016'
'6789': '131019'
'6790': '131020'
'6791': '131021'
'6792': '131024'
'6793': '131166'
'6794': '131292'
'6795': '131323'
'6796': '131324'
'6797': '131325'
'6798': '131326'
'6799': '131327'
'6800': '131385'
'6801': '131410'
'6802': '131422'
'6803': '131425'
'6804': '131426'
'6805': '131436'
'6806': '131439'
'6807': '131444'
'6808': '131446'
'6809': '131448'
'6810': '131449'
'6811': '131451'
'6812': '131452'
'6813': '131453'
'6814': '131454'
'6815': '131476'
'6816': '131536'
'6817': '131540'
'6818': '131552'
'6819': '131553'
'6820': '131554'
'6821': '131567'
'6822': '131624'
'6823': '131656'
'6824': '131657'
'6825': '131658'
'6826': '131764'
'6827': '131767'
'6828': '131770'
'6829': '131771'
'6830': '131772'
'6831': '131773'
'6832': '131774'
'6833': '131787'
'6834': '131789'
'6835': '131791'
'6836': '131792'
'6837': '131794'
'6838': '131795'
'6839': '131796'
'6840': '131797'
'6841': '131837'
'6842': '131897'
'6843': '131899'
'6844': '131900'
'6845': '131901'
'6846': '131902'
'6847': '131903'
'6848': '131904'
'6849': '131911'
'6850': '131912'
'6851': '131913'
'6852': '131914'
'6853': '131917'
'6854': '131918'
'6855': '131919'
'6856': '131922'
'6857': '131923'
'6858': '131924'
'6859': '131925'
'6860': '131932'
'6861': '131933'
'6862': '131934'
'6863': '131935'
'6864': '131936'
'6865': '131938'
'6866': '131939'
'6867': '131940'
'6868': '131941'
'6869': '131942'
'6870': '131950'
'6871': '131951'
'6872': '131952'
'6873': '131953'
'6874': '131978'
'6875': '131979'
'6876': '131980'
'6877': '131982'
'6878': '131983'
'6879': '131984'
'6880': '131985'
'6881': '131986'
'6882': '132019'
'6883': '132040'
'6884': '132041'
'6885': '132042'
'6886': '132045'
'6887': '132117'
'6888': '132118'
'6889': '132122'
'6890': '132134'
'6891': '132138'
'6892': '132139'
'6893': '132140'
'6894': '132141'
'6895': '132142'
'6896': '132171'
'6897': '132272'
'6898': '132310'
'6899': '132420'
'6900': '132424'
'6901': '132434'
'6902': '132436'
'6903': '132448'
'6904': '132449'
'6905': '132453'
'6906': '132454'
'6907': '132455'
'6908': '132456'
'6909': '132561'
'6910': '132566'
'6911': '132567'
'6912': '132568'
'6913': '132589'
'6914': '132675'
'6915': '132677'
'6916': '132678'
'6917': '132679'
'6918': '132773'
'6919': '132774'
'6920': '132775'
'6921': '132778'
'6922': '132779'
'6923': '132781'
'6924': '132784'
'6925': '132786'
'6926': '132787'
'6927': '132788'
'6928': '132789'
'6929': '132790'
'6930': '132791'
'6931': '132792'
'6932': '132793'
'6933': '132794'
'6934': '132795'
'6935': '132914'
'6936': '132954'
'6937': '132961'
'6938': '132962'
'6939': '132963'
'6940': '132964'
'6941': '132965'
'6942': '133015'
'6943': '133016'
'6944': '133019'
'6945': '133020'
'6946': '133022'
'6947': '133023'
'6948': '133024'
'6949': '133025'
'6950': '133026'
'6951': '133027'
'6952': '133028'
'6953': '133029'
'6954': '133100'
'6955': '133102'
'6956': '133272'
'6957': '133273'
'6958': '133274'
'6959': '133275'
'6960': '133276'
'6961': '133293'
'6962': '133294'
'6963': '133332'
'6964': '133333'
'6965': '133431'
'6966': '133432'
'6967': '133433'
'6968': '133434'
'6969': '133435'
'6970': '133436'
'6971': '133437'
'6972': '133438'
'6973': '133439'
'6974': '133440'
'6975': '133441'
'6976': '133442'
'6977': '133443'
'6978': '133444'
'6979': '133445'
'6980': '133446'
'6981': '133447'
'6982': '133448'
'6983': '133449'
'6984': '133450'
'6985': '133451'
'6986': '133452'
'6987': '133453'
'6988': '133454'
'6989': '133455'
'6990': '133456'
'6991': '133457'
'6992': '133459'
'6993': '133479'
'6994': '133535'
'6995': '133537'
'6996': '133538'
'6997': '133544'
'6998': '133545'
'6999': '133546'
'7000': '133551'
'7001': '133553'
'7002': '133560'
'7003': '133561'
'7004': '133562'
'7005': '133563'
'7006': '133564'
'7007': '133567'
'7008': '133571'
'7009': '133572'
'7010': '133573'
'7011': '133574'
'7012': '133576'
'7013': '133579'
'7014': '133580'
'7015': '133632'
'7016': '133638'
'7017': '133639'
'7018': '133681'
'7019': '133729'
'7020': '133731'
'7021': '133770'
'7022': '133772'
'7023': '133780'
'7024': '133781'
'7025': '133788'
'7026': '133793'
'7027': '133798'
'7028': '133802'
'7029': '133803'
'7030': '133833'
'7031': '133835'
'7032': '133836'
'7033': '133837'
'7034': '133838'
'7035': '133916'
'7036': '133942'
'7037': '133943'
'7038': '133967'
'7039': '133968'
'7040': '133969'
'7041': '133970'
'7042': '133971'
'7043': '133972'
'7044': '133973'
'7045': '133974'
'7046': '133975'
'7047': '133976'
'7048': '133977'
'7049': '133978'
'7050': '134034'
'7051': '134052'
'7052': '134053'
'7053': '134054'
'7054': '134073'
'7055': '134077'
'7056': '134084'
'7057': '134094'
'7058': '134359'
'7059': '134384'
'7060': '134385'
'7061': '134388'
'7062': '134389'
'7063': '134443'
'7064': '134444'
'7065': '134445'
'7066': '134446'
'7067': '134447'
'7068': '134448'
'7069': '134449'
'7070': '134452'
'7071': '134453'
'7072': '134454'
'7073': '134455'
'7074': '134486'
'7075': '134509'
'7076': '134510'
'7077': '134580'
'7078': '134586'
'7079': '134594'
'7080': '134610'
'7081': '134631'
'7082': '134643'
'7083': '134790'
'7084': '134791'
'7085': '134792'
'7086': '134793'
'7087': '134794'
'7088': '134795'
'7089': '134796'
'7090': '134797'
'7091': '134801'
'7092': '134823'
'7093': '134824'
'7094': '134825'
'7095': '134826'
'7096': '134827'
'7097': '134918'
'7098': '134919'
'7099': '134922'
'7100': '134923'
'7101': '134928'
'7102': '134929'
'7103': '134930'
'7104': '134931'
'7105': '134932'
'7106': '134933'
'7107': '134934'
'7108': '134935'
'7109': '134936'
'7110': '134937'
'7111': '134938'
'7112': '134939'
'7113': '134940'
'7114': '134941'
'7115': '134942'
'7116': '134943'
'7117': '134947'
'7118': '134948'
'7119': '134949'
'7120': '134950'
'7121': '134951'
'7122': '134952'
'7123': '134956'
'7124': '134959'
'7125': '134962'
'7126': '134979'
'7127': '134981'
'7128': '135010'
'7129': '135028'
'7130': '135039'
'7131': '135043'
'7132': '135044'
'7133': '135054'
'7134': '135089'
'7135': '135091'
'7136': '135092'
'7137': '135219'
'7138': '135220'
'7139': '135221'
'7140': '135222'
'7141': '135223'
'7142': '135224'
'7143': '135225'
'7144': '135226'
'7145': '135227'
'7146': '135228'
'7147': '135229'
'7148': '135336'
'7149': '135337'
'7150': '135338'
'7151': '135339'
'7152': '135340'
'7153': '135341'
'7154': '135342'
'7155': '135363'
'7156': '135364'
'7157': '135365'
'7158': '135368'
'7159': '135369'
'7160': '135370'
'7161': '135371'
'7162': '135372'
'7163': '135373'
'7164': '135374'
'7165': '135375'
'7166': '135986'
'7167': '135989'
'7168': '135990'
'7169': '136054'
'7170': '136091'
'7171': '136094'
'7172': '136134'
'7173': '136137'
'7174': '136138'
'7175': '136275'
'7176': '136276'
'7177': '136320'
'7178': '136321'
'7179': '136322'
'7180': '136323'
'7181': '136324'
'7182': '136331'
'7183': '136404'
'7184': '136424'
'7185': '136449'
'7186': '136465'
'7187': '136466'
'7188': '136467'
'7189': '136468'
'7190': '136469'
'7191': '136705'
'7192': '136706'
'7193': '136707'
'7194': '136708'
'7195': '136709'
'7196': '136928'
'7197': '136994'
'7198': '136995'
'7199': '137054'
'7200': '137151'
'7201': '137152'
'7202': '137166'
'7203': '137167'
'7204': '137168'
'7205': '137169'
'7206': '137170'
'7207': '137171'
'7208': '137172'
'7209': '137173'
'7210': '137174'
'7211': '137175'
'7212': '137176'
'7213': '137211'
'7214': '137212'
'7215': '137213'
'7216': '137214'
'7217': '137356'
'7218': '137417'
'7219': '137418'
'7220': '137419'
'7221': '137423'
'7222': '137424'
'7223': '137425'
'7224': '137426'
'7225': '137462'
'7226': '137463'
'7227': '137484'
'7228': '137500'
'7229': '137551'
'7230': '137561'
'7231': '137563'
'7232': '137567'
'7233': '137593'
'7234': '137605'
'7235': '137624'
'7236': '137627'
'7237': '137630'
'7238': '137631'
'7239': '137632'
'7240': '137715'
'7241': '137716'
'7242': '137717'
'7243': '137719'
'7244': '137720'
'7245': '137721'
'7246': '137722'
'7247': '137723'
'7248': '137724'
'7249': '137725'
'7250': '137740'
'7251': '137895'
'7252': '137896'
'7253': '137898'
'7254': '137899'
'7255': '137900'
'7256': '137901'
'7257': '137907'
'7258': '137935'
'7259': '137990'
'7260': '137998'
'7261': '138010'
'7262': '138015'
'7263': '138016'
'7264': '138017'
'7265': '138018'
'7266': '138019'
'7267': '138020'
'7268': '138021'
'7269': '138022'
'7270': '138023'
'7271': '138024'
'7272': '138025'
'7273': '138026'
'7274': '138038'
'7275': '138039'
'7276': '138040'
'7277': '138041'
'7278': '138053'
'7279': '138060'
'7280': '138061'
'7281': '138062'
'7282': '138063'
'7283': '138064'
'7284': '138065'
'7285': '138066'
'7286': '138067'
'7287': '138068'
'7288': '138069'
'7289': '138070'
'7290': '138071'
'7291': '138207'
'7292': '138210'
'7293': '138211'
'7294': '138212'
'7295': '138213'
'7296': '138215'
'7297': '138216'
'7298': '138217'
'7299': '138218'
'7300': '138256'
'7301': '138282'
'7302': '138306'
'7303': '138311'
'7304': '138317'
'7305': '138318'
'7306': '138319'
'7307': '138320'
'7308': '138351'
'7309': '138355'
'7310': '138406'
'7311': '138410'
'7312': '138413'
'7313': '138414'
'7314': '138415'
'7315': '138416'
'7316': '138578'
'7317': '138579'
'7318': '138580'
'7319': '138581'
'7320': '139003'
'7321': '139043'
'7322': '139110'
'7323': '139112'
'7324': '139117'
'7325': '139123'
'7326': '139226'
'7327': '139329'
'7328': '139330'
'7329': '139461'
'7330': '139485'
'7331': '139491'
'7332': '139520'
'7333': '139521'
'7334': '139522'
'7335': '139523'
'7336': '139524'
'7337': '139532'
'7338': '139534'
'7339': '139536'
'7340': '139537'
'7341': '139637'
'7342': '139638'
'7343': '139663'
'7344': '139681'
'7345': '139687'
'7346': '139688'
'7347': '139769'
'7348': '139770'
'7349': '139771'
'7350': '139772'
'7351': '139773'
'7352': '139774'
'7353': '139775'
'7354': '139776'
'7355': '139777'
'7356': '139804'
'7357': '139862'
'7358': '139876'
'7359': '139933'
'7360': '139934'
'7361': '139935'
'7362': '139936'
'7363': '139937'
'7364': '139954'
'7365': '139990'
'7366': '139991'
'7367': '139992'
'7368': '139993'
'7369': '139994'
'7370': '139995'
'7371': '140043'
'7372': '140134'
'7373': '140135'
'7374': '140258'
'7375': '140259'
'7376': '140260'
'7377': '140261'
'7378': '140262'
'7379': '140263'
'7380': '140266'
'7381': '140316'
'7382': '140344'
'7383': '140421'
'7384': '140564'
'7385': '140565'
'7386': '140566'
'7387': '140576'
'7388': '140583'
'7389': '140584'
'7390': '140609'
'7391': '140620'
'7392': '140621'
'7393': '140623'
'7394': '140625'
'7395': '140626'
'7396': '140788'
'7397': '140789'
'7398': '140790'
'7399': '140791'
'7400': '140794'
'7401': '140871'
'7402': '140872'
'7403': '140873'
'7404': '140874'
'7405': '140875'
'7406': '140922'
'7407': '140923'
'7408': '140924'
'7409': '140925'
'7410': '140926'
'7411': '140933'
'7412': '140934'
'7413': '140935'
'7414': '140939'
'7415': '141074'
'7416': '141137'
'7417': '141139'
'7418': '141141'
'7419': '141143'
'7420': '141144'
'7421': '141164'
'7422': '141166'
'7423': '141167'
'7424': '141168'
'7425': '141173'
'7426': '141179'
'7427': '141180'
'7428': '141181'
'7429': '141182'
'7430': '141264'
'7431': '141282'
'7432': '141283'
'7433': '141284'
'7434': '141285'
'7435': '141286'
'7436': '141287'
'7437': '141288'
'7438': '141289'
'7439': '141290'
'7440': '141291'
'7441': '141292'
'7442': '141293'
'7443': '141295'
'7444': '141296'
'7445': '141297'
'7446': '141299'
'7447': '141300'
'7448': '141303'
'7449': '141304'
'7450': '141310'
'7451': '141375'
'7452': '141561'
'7453': '141562'
'7454': '141564'
'7455': '141566'
'7456': '141567'
'7457': '141568'
'7458': '141569'
'7459': '141590'
'7460': '141591'
'7461': '141592'
'7462': '141593'
'7463': '141594'
'7464': '141616'
'7465': '141617'
'7466': '141618'
'7467': '141619'
'7468': '141735'
'7469': '141873'
'7470': '141874'
'7471': '141875'
'7472': '141876'
'7473': '141877'
'7474': '141878'
'7475': '141894'
'7476': '141901'
'7477': '141902'
'7478': '141903'
'7479': '141972'
'7480': '142078'
'7481': '142079'
'7482': '142080'
'7483': '142081'
'7484': '142082'
'7485': '142083'
'7486': '142084'
'7487': '142085'
'7488': '142086'
'7489': '142087'
'7490': '142088'
'7491': '142089'
'7492': '142091'
'7493': '142092'
'7494': '142093'
'7495': '142094'
'7496': '142096'
'7497': '142097'
'7498': '142098'
'7499': '142128'
'7500': '142129'
'7501': '142132'
'7502': '142133'
'7503': '142358'
'7504': '142359'
'7505': '142360'
'7506': '142361'
'7507': '142362'
'7508': '142381'
'7509': '142402'
'7510': '142418'
'7511': '142433'
'7512': '142511'
'7513': '142516'
'7514': '142517'
'7515': '142519'
'7516': '142528'
'7517': '142529'
'7518': '142530'
'7519': '142531'
'7520': '142532'
'7521': '142533'
'7522': '142534'
'7523': '142535'
'7524': '142536'
'7525': '142537'
'7526': '142538'
'7527': '142539'
'7528': '142549'
'7529': '142550'
'7530': '142551'
'7531': '142552'
'7532': '142553'
'7533': '142563'
'7534': '142564'
'7535': '142565'
'7536': '142566'
'7537': '142567'
'7538': '142568'
'7539': '142569'
'7540': '142570'
'7541': '142571'
'7542': '142572'
'7543': '142573'
'7544': '142574'
'7545': '142575'
'7546': '142576'
'7547': '142577'
'7548': '142579'
'7549': '142641'
'7550': '142666'
'7551': '142668'
'7552': '142669'
'7553': '142670'
'7554': '142671'
'7555': '142672'
'7556': '142947'
'7557': '142948'
'7558': '142949'
'7559': '142950'
'7560': '143039'
'7561': '143046'
'7562': '143055'
'7563': '143056'
'7564': '143057'
'7565': '143058'
'7566': '143059'
'7567': '143060'
'7568': '143061'
'7569': '143095'
'7570': '143097'
'7571': '143098'
'7572': '143099'
'7573': '143106'
'7574': '143186'
'7575': '143214'
'7576': '143215'
'7577': '143216'
'7578': '143217'
'7579': '143218'
'7580': '143219'
'7581': '143220'
'7582': '143221'
'7583': '143237'
'7584': '143239'
'7585': '143290'
'7586': '143295'
'7587': '143296'
'7588': '143299'
'7589': '143300'
'7590': '143303'
'7591': '143304'
'7592': '143305'
'7593': '143306'
'7594': '143307'
'7595': '143308'
'7596': '143309'
'7597': '143318'
'7598': '143319'
'7599': '143532'
'7600': '143941'
'7601': '143989'
'7602': '143995'
'7603': '144170'
'7604': '144171'
'7605': '144172'
'7606': '144173'
'7607': '144179'
'7608': '144180'
'7609': '144181'
'7610': '144182'
'7611': '144212'
'7612': '144213'
'7613': '144214'
'7614': '144215'
'7615': '144216'
'7616': '144423'
'7617': '144424'
'7618': '144454'
'7619': '144465'
'7620': '144466'
'7621': '144467'
'7622': '144468'
'7623': '144469'
'7624': '144470'
'7625': '144471'
'7626': '144472'
'7627': '144473'
'7628': '144474'
'7629': '144475'
'7630': '144476'
'7631': '144477'
'7632': '144487'
'7633': '144492'
'7634': '144542'
'7635': '144543'
'7636': '144544'
'7637': '144545'
'7638': '144546'
'7639': '144547'
'7640': '144548'
'7641': '144549'
'7642': '144550'
'7643': '144551'
'7644': '144552'
'7645': '144587'
'7646': '144592'
'7647': '144600'
'7648': '144733'
'7649': '144740'
'7650': '144741'
'7651': '144801'
'7652': '144809'
'7653': '144810'
'7654': '144933'
'7655': '144934'
'7656': '144935'
'7657': '144936'
'7658': '144937'
'7659': '144938'
'7660': '144939'
'7661': '144940'
'7662': '144941'
'7663': '144942'
'7664': '144943'
'7665': '144944'
'7666': '144945'
'7667': '144946'
'7668': '145002'
'7669': '145003'
'7670': '145004'
'7671': '145005'
'7672': '145020'
'7673': '145027'
'7674': '145041'
'7675': '145042'
'7676': '145043'
'7677': '145058'
'7678': '145059'
'7679': '145067'
'7680': '145068'
'7681': '145074'
'7682': '145183'
'7683': '145189'
'7684': '145199'
'7685': '145241'
'7686': '145257'
'7687': '145258'
'7688': '145259'
'7689': '145260'
'7690': '145431'
'7691': '145432'
'7692': '145457'
'7693': '145458'
'7694': '145462'
'7695': '145464'
'7696': '145475'
'7697': '145476'
'7698': '145477'
'7699': '145549'
'7700': '145550'
'7701': '145551'
'7702': '145552'
'7703': '145553'
'7704': '145554'
'7705': '145555'
'7706': '145556'
'7707': '145606'
'7708': '145607'
'7709': '145608'
'7710': '145609'
'7711': '145610'
'7712': '145645'
'7713': '145646'
'7714': '145653'
'7715': '145702'
'7716': '145703'
'7717': '145704'
'7718': '145705'
'7719': '145706'
'7720': '145707'
'7721': '145708'
'7722': '145709'
'7723': '145710'
'7724': '145711'
'7725': '145724'
'7726': '145727'
'7727': '145728'
'7728': '145729'
'7729': '145730'
'7730': '145741'
'7731': '145742'
'7732': '145743'
'7733': '145744'
'7734': '145745'
'7735': '145746'
'7736': '145747'
'7737': '145748'
'7738': '145749'
'7739': '145750'
'7740': '145751'
'7741': '145752'
'7742': '145754'
'7743': '145755'
'7744': '145756'
'7745': '145757'
'7746': '145758'
'7747': '145759'
'7748': '145760'
'7749': '145761'
'7750': '145762'
'7751': '145777'
'7752': '145780'
'7753': '145783'
'7754': '145887'
'7755': '145917'
'7756': '145918'
'7757': '146017'
'7758': '146018'
'7759': '146019'
'7760': '146020'
'7761': '146070'
'7762': '146147'
'7763': '146148'
'7764': '146149'
'7765': '146150'
'7766': '146151'
'7767': '146152'
'7768': '146153'
'7769': '146343'
'7770': '146458'
'7771': '146478'
'7772': '146481'
'7773': '146482'
'7774': '146483'
'7775': '146639'
'7776': '146681'
'7777': '146683'
'7778': '146685'
'7779': '146687'
'7780': '146689'
'7781': '146713'
'7782': '146716'
'7783': '146724'
'7784': '146725'
'7785': '146726'
'7786': '146727'
'7787': '146879'
'7788': '146961'
'7789': '146968'
'7790': '146969'
'7791': '146970'
'7792': '146988'
'7793': '146989'
'7794': '147020'
'7795': '147021'
'7796': '147022'
'7797': '147023'
'7798': '147024'
'7799': '147059'
'7800': '147085'
'7801': '147086'
'7802': '147087'
'7803': '147126'
'7804': '147191'
'7805': '147261'
'7806': '147265'
'7807': '147267'
'7808': '147268'
'7809': '147269'
'7810': '147295'
'7811': '147309'
'7812': '147409'
'7813': '147412'
'7814': '147413'
'7815': '147780'
'7816': '147815'
'7817': '147886'
'7818': '147956'
'7819': '148002'
'7820': '148028'
'7821': '148031'
'7822': '148032'
'7823': '148066'
'7824': '148070'
'7825': '148074'
'7826': '148075'
'7827': '148076'
'7828': '148077'
'7829': '148078'
'7830': '148079'
'7831': '148082'
'7832': '148099'
'7833': '148112'
'7834': '148113'
'7835': '148114'
'7836': '148120'
'7837': '148121'
'7838': '148124'
'7839': '148130'
'7840': '148131'
'7841': '148132'
'7842': '148133'
'7843': '148168'
'7844': '148186'
'7845': '148187'
'7846': '148190'
'7847': '148208'
'7848': '148210'
'7849': '148211'
'7850': '148212'
'7851': '148213'
'7852': '148214'
'7853': '148215'
'7854': '148216'
'7855': '148217'
'7856': '148218'
'7857': '148231'
'7858': '148233'
'7859': '148234'
'7860': '148235'
'7861': '148246'
'7862': '148285'
'7863': '148286'
'7864': '148287'
'7865': '148288'
'7866': '148289'
'7867': '148290'
'7868': '148302'
'7869': '148303'
'7870': '148305'
'7871': '148429'
'7872': '148430'
'7873': '148439'
'7874': '148441'
'7875': '148443'
'7876': '148444'
'7877': '148510'
'7878': '148513'
'7879': '148514'
'7880': '148516'
'7881': '148517'
'7882': '148518'
'7883': '148519'
'7884': '148532'
'7885': '148535'
'7886': '148536'
'7887': '148537'
'7888': '148584'
'7889': '148585'
'7890': '148586'
'7891': '148587'
'7892': '148602'
'7893': '148603'
'7894': '148604'
'7895': '148605'
'7896': '148606'
'7897': '148607'
'7898': '148608'
'7899': '148609'
'7900': '148610'
'7901': '148611'
'7902': '148612'
'7903': '148613'
'7904': '148773'
'7905': '149075'
'7906': '149078'
'7907': '149082'
'7908': '149083'
'7909': '149099'
'7910': '149100'
'7911': '149101'
'7912': '149102'
'7913': '149103'
'7914': '149118'
'7915': '149124'
'7916': '149138'
'7917': '149139'
'7918': '149140'
'7919': '149141'
'7920': '149142'
'7921': '149143'
'7922': '149185'
'7923': '149369'
'7924': '149370'
'7925': '149416'
'7926': '149417'
'7927': '149422'
'7928': '149452'
'7929': '149488'
'7930': '149523'
'7931': '149623'
'7932': '149625'
'7933': '149626'
'7934': '149687'
'7935': '149689'
'7936': '149690'
'7937': '149700'
'7938': '149701'
'7939': '149712'
'7940': '149714'
'7941': '149727'
'7942': '149750'
'7943': '149775'
'7944': '149776'
'7945': '149777'
'7946': '149778'
'7947': '149842'
'7948': '149951'
'7949': '149953'
'7950': '150015'
'7951': '150017'
'7952': '150018'
'7953': '150062'
'7954': '150063'
'7955': '150064'
'7956': '150073'
'7957': '150078'
'7958': '150079'
'7959': '150080'
'7960': '150265'
'7961': '150266'
'7962': '150267'
'7963': '150268'
'7964': '150287'
'7965': '150288'
'7966': '151404'
'7967': '152103'
'7968': '152253'
'7969': '152254'
'7970': '152258'
'7971': '152261'
'7972': '152262'
'7973': '152324'
'7974': '152418'
'7975': '152425'
'7976': '152480'
'7977': '152543'
'7978': '152545'
'7979': '152568'
'7980': '152569'
'7981': '152570'
'7982': '153337'
'7983': '153383'
'7984': '153452'
'7985': '153946'
'7986': '153955'
'7987': '153956'
'7988': '154303'
'7989': '154305'
'7990': '154306'
'7991': '154307'
'7992': '154308'
'7993': '154309'
'7994': '154413'
'7995': '154414'
'7996': '155066'
- name: label
dtype: int64
splits:
- name: train
num_bytes: 2137269729.0
num_examples: 39985
download_size: 2117712815
dataset_size: 2137269729.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
betogaunt/acredita.zip | ---
license: openrail
---
|
ConvLab/dailydialog | ---
language:
- en
license:
- cc-by-nc-sa-4.0
multilinguality:
- monolingual
pretty_name: DailyDialog
size_categories:
- 10K<n<100K
task_categories:
- conversational
---
# Dataset Card for DailyDialog
- **Repository:** http://yanran.li/dailydialog
- **Paper:** https://arxiv.org/pdf/1710.03957.pdf
- **Leaderboard:** None
- **Who transforms the dataset:** Qi Zhu (zhuq96 at gmail dot com)
To use this dataset, you need to install [ConvLab-3](https://github.com/ConvLab/ConvLab-3) platform first. Then you can load the dataset via:
```
from convlab.util import load_dataset, load_ontology, load_database
dataset = load_dataset('dailydialog')
ontology = load_ontology('dailydialog')
database = load_database('dailydialog')
```
For more usage details, please refer to [here](https://github.com/ConvLab/ConvLab-3/tree/master/data/unified_datasets).
### Dataset Summary
DailyDialog is a high-quality multi-turn dialog dataset. It is appealing in several respects: the language is human-written and comparatively noise-free, the dialogues reflect everyday communication and cover a variety of daily-life topics, and the dataset is manually labelled with communication intention and emotion information.
- **How to get the transformed data from original data:**
- Download [ijcnlp_dailydialog.zip](http://yanran.li/files/ijcnlp_dailydialog.zip).
- Run `python preprocess.py` in the current directory.
- **Main changes of the transformation:**
- Use `topic` annotation as `domain`. If duplicated dialogs are annotated with different topics, use the most frequent one.
- Use `intent` annotation as `binary` dialogue act.
- Retain emotion annotation in the `emotion` field of each turn.
- Use nltk to remove space before punctuation: `utt = ' '.join([detokenizer.detokenize(word_tokenize(s)) for s in sent_tokenize(utt)])`.
- Replace `" ’ "` with `"'"`: `utt = utt.replace(' ’ ', "'")`.
  - Add a space after each full stop.
- **Annotations:**
- intent, emotion
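The punctuation cleanup described above uses nltk's tokenizers; a minimal regex-only sketch (an approximation for illustration, not the exact preprocessing code) behaves similarly:

```python
import re

def clean_utterance(utt: str) -> str:
    """Approximate the card's cleanup: normalize the curly apostrophe,
    drop spaces before punctuation, and add a space after a full stop."""
    utt = utt.replace(" \u2019 ", "'")              # " ’ " -> "'"
    utt = re.sub(r"\s+([.,!?;:])", r"\1", utt)      # remove space before punctuation
    utt = re.sub(r"\.(?=\S)", ". ", utt)            # ensure space after full stop
    return utt

print(clean_utterance("Hi . How are you ? I \u2019 m fine."))
# -> Hi. How are you? I'm fine.
```

The actual pipeline relies on `nltk`'s `word_tokenize`/`detokenize` round-trip, which handles more cases than these three rules.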
### Supported Tasks and Leaderboards
NLU, NLG
### Languages
English
### Data Splits
| split | dialogues | utterances | avg_utt | avg_tokens | avg_domains | cat slot match(state) | cat slot match(goal) | cat slot match(dialogue act) | non-cat slot span(dialogue act) |
|------------|-------------|--------------|-----------|--------------|---------------|-------------------------|------------------------|--------------------------------|-----------------------------------|
| train | 11118 | 87170 | 7.84 | 11.22 | 1 | - | - | - | - |
| validation | 1000 | 8069 | 8.07 | 11.16 | 1 | - | - | - | - |
| test | 1000 | 7740 | 7.74 | 11.36 | 1 | - | - | - | - |
| all | 13118 | 102979 | 7.85 | 11.22 | 1 | - | - | - | - |
10 domains: ['Ordinary Life', 'School Life', 'Culture & Education', 'Attitude & Emotion', 'Relationship', 'Tourism', 'Health', 'Work', 'Politics', 'Finance']
- **cat slot match**: how many values of categorical slots are in the possible values of ontology in percentage.
- **non-cat slot span**: how many values of non-categorical slots have span annotation in percentage.
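As a quick sanity check, the `avg_utt` column in the table above is simply utterances divided by dialogues per split:

```python
# Verify avg_utt = utterances / dialogues for each split in the table.
splits = {
    "train":      (11118, 87170),
    "validation": (1000,  8069),
    "test":       (1000,  7740),
    "all":        (13118, 102979),
}
for name, (dialogues, utterances) in splits.items():
    print(name, round(utterances / dialogues, 2))
# train 7.84, validation 8.07, test 7.74, all 7.85
```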
### Citation
```
@InProceedings{li2017dailydialog,
author = {Li, Yanran and Su, Hui and Shen, Xiaoyu and Li, Wenjie and Cao, Ziqiang and Niu, Shuzi},
title = {DailyDialog: A Manually Labelled Multi-turn Dialogue Dataset},
booktitle = {Proceedings of The 8th International Joint Conference on Natural Language Processing (IJCNLP 2017)},
year = {2017}
}
```
### Licensing Information
[**CC BY-NC-SA 4.0**](https://creativecommons.org/licenses/by-nc-sa/4.0/) |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/7ea9ec89 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 184
num_examples: 10
download_size: 1325
dataset_size: 184
---
# Dataset Card for "7ea9ec89"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
teticio/audio-diffusion-breaks-256 | ---
annotations_creators: []
language: []
language_creators: []
license: []
multilinguality: []
pretty_name: Mel spectrograms of sampled music
size_categories:
- 10K<n<100K
source_datasets: []
tags:
- audio
- spectrograms
task_categories:
- image-to-image
task_ids: []
---
30,000 256x256 mel spectrograms of 5-second samples that have been used in music, sourced from [WhoSampled](https://whosampled.com) and [YouTube](https://youtube.com). The code to convert between audio and spectrograms can be found at https://github.com/teticio/audio-diffusion, along with scripts to train and run inference using Denoising Diffusion Probabilistic Models.
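With the parameters listed below, each 256-frame spectrogram corresponds to roughly 5.9 seconds of audio (a back-of-envelope check, not code from the repository):

```python
# Audio duration covered by one 256x256 mel spectrogram,
# using the parameters from this card.
x_res = 256        # spectrogram width in frames
hop_length = 512   # samples advanced per frame
sample_rate = 22050

samples_per_spectrogram = x_res * hop_length        # 131072 samples
duration_s = samples_per_spectrogram / sample_rate  # ~5.94 s
print(f"{duration_s:.2f} s")
```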
```
x_res = 256
y_res = 256
sample_rate = 22050
n_fft = 2048
hop_length = 512
``` |
ayoub999/factures_generales | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: bboxes
sequence:
sequence: int64
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': Ref
'2': NumFa
'3': Fourniss
'4': DateFa
'5': DateLim
'6': TotalHT
'7': TVA
'8': TotalTTc
'9': unitP
'10': Qt
'11': TVAP
'12': Désignation
'13': Adresse
- name: tokens
sequence: string
splits:
- name: train
num_bytes: 1242002.0
num_examples: 4
- name: test
num_bytes: 621001.0
num_examples: 2
download_size: 1794949
dataset_size: 1863003.0
---
# Dataset Card for "factures_generales"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_openthaigpt__openthaigpt-1.0.0-alpha-7b-chat-ckpt-hf | ---
pretty_name: Evaluation run of openthaigpt/openthaigpt-1.0.0-alpha-7b-chat-ckpt-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [openthaigpt/openthaigpt-1.0.0-alpha-7b-chat-ckpt-hf](https://huggingface.co/openthaigpt/openthaigpt-1.0.0-alpha-7b-chat-ckpt-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 64 configurations, each one corresponding to one of\
  \ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
  \ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openthaigpt__openthaigpt-1.0.0-alpha-7b-chat-ckpt-hf\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T23:15:18.463104](https://huggingface.co/datasets/open-llm-leaderboard/details_openthaigpt__openthaigpt-1.0.0-alpha-7b-chat-ckpt-hf/blob/main/results_2023-09-22T23-15-18.463104.json)(note\
  \ that there might be results for other tasks in the repo if successive evals didn't\
  \ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.031774328859060404,\n\
\ \"em_stderr\": 0.0017962473521312393,\n \"f1\": 0.08420092281879202,\n\
\ \"f1_stderr\": 0.0021474530604162255,\n \"acc\": 0.3646366953032391,\n\
\ \"acc_stderr\": 0.00915095624646051\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.031774328859060404,\n \"em_stderr\": 0.0017962473521312393,\n\
\ \"f1\": 0.08420092281879202,\n \"f1_stderr\": 0.0021474530604162255\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.03866565579984837,\n \
\ \"acc_stderr\": 0.005310583162098024\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6906077348066298,\n \"acc_stderr\": 0.012991329330822995\n\
\ }\n}\n```"
repo_url: https://huggingface.co/openthaigpt/openthaigpt-1.0.0-alpha-7b-chat-ckpt-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|arc:challenge|25_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T23_15_18.463104
path:
- '**/details_harness|drop|3_2023-09-22T23-15-18.463104.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T23-15-18.463104.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T23_15_18.463104
path:
- '**/details_harness|gsm8k|5_2023-09-22T23-15-18.463104.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T23-15-18.463104.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hellaswag|10_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T12:43:45.904593.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T12:43:45.904593.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T12:43:45.904593.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T23_15_18.463104
path:
- '**/details_harness|winogrande|5_2023-09-22T23-15-18.463104.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T23-15-18.463104.parquet'
- config_name: results
data_files:
- split: 2023_08_18T12_43_45.904593
path:
- results_2023-08-18T12:43:45.904593.parquet
- split: 2023_09_22T23_15_18.463104
path:
- results_2023-09-22T23-15-18.463104.parquet
- split: latest
path:
- results_2023-09-22T23-15-18.463104.parquet
---
# Dataset Card for Evaluation run of openthaigpt/openthaigpt-1.0.0-alpha-7b-chat-ckpt-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/openthaigpt/openthaigpt-1.0.0-alpha-7b-chat-ckpt-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [openthaigpt/openthaigpt-1.0.0-alpha-7b-chat-ckpt-hf](https://huggingface.co/openthaigpt/openthaigpt-1.0.0-alpha-7b-chat-ckpt-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openthaigpt__openthaigpt-1.0.0-alpha-7b-chat-ckpt-hf",
"harness_winogrande_5",
split="train")
```
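As the config listing above shows, each split is named after the run timestamp with the `-` and `:` separators replaced by `_`. A small sketch of that mapping (inferred from the config names in this card, not an official API):

```python
def timestamp_to_split(ts: str) -> str:
    # '2023-08-18T12:43:45.904593' -> '2023_08_18T12_43_45.904593'
    # Replace the date/time separators with underscores; the
    # fractional-second dot is kept as-is.
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-09-22T23:15:18.463104"))
# 2023_09_22T23_15_18.463104
```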
## Latest results
These are the [latest results from run 2023-09-22T23:15:18.463104](https://huggingface.co/datasets/open-llm-leaderboard/details_openthaigpt__openthaigpt-1.0.0-alpha-7b-chat-ckpt-hf/blob/main/results_2023-09-22T23-15-18.463104.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; each one can be found in its own timestamped split and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.031774328859060404,
"em_stderr": 0.0017962473521312393,
"f1": 0.08420092281879202,
"f1_stderr": 0.0021474530604162255,
"acc": 0.3646366953032391,
"acc_stderr": 0.00915095624646051
},
"harness|drop|3": {
"em": 0.031774328859060404,
"em_stderr": 0.0017962473521312393,
"f1": 0.08420092281879202,
"f1_stderr": 0.0021474530604162255
},
"harness|gsm8k|5": {
"acc": 0.03866565579984837,
"acc_stderr": 0.005310583162098024
},
"harness|winogrande|5": {
"acc": 0.6906077348066298,
"acc_stderr": 0.012991329330822995
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Musha-the-Yusha/mushi-snli-llama2-grammar_struct-10k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3456483
num_examples: 10000
download_size: 1106306
dataset_size: 3456483
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "mushi-snli-llama2-grammar_struct-10k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
boda/cryptonite | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
dataset_info:
features:
- name: publisher
dtype: string
- name: date
dtype: timestamp[ns]
- name: author
dtype: string
- name: orientation
dtype: string
- name: clue
dtype: string
- name: answer
dtype: string
- name: enumeration
dtype: string
- name: quick
dtype: bool
- name: sub_publisher
dtype: string
splits:
- name: train
num_bytes: 51949570
num_examples: 470804
- name: val
num_bytes: 2886129
num_examples: 26156
- name: test
num_bytes: 2891443
num_examples: 26157
download_size: 26277347
dataset_size: 57727142
---
# Dataset Card for "cryptonite"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dattatreya303/covid-qa-tts | ---
annotations_creators:
- found
language:
- en
language_creators:
- found
license:
- mit
multilinguality:
- monolingual
pretty_name: covid-qa-tts
size_categories:
- 1K<n<10K
source_datasets:
- extended|covid_qa_deepset
tags: []
task_categories:
- question-answering
task_ids:
- closed-domain-qa
---
# Dataset Card for covid-qa-tts
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@github-username](https://github.com/<github-username>) for adding this dataset. |
evoeval/EvoEval_difficult | ---
license: apache-2.0
language:
- en
tags:
- code
--- |
RevEng-23-24/Dataset360K | ---
dataset_info:
features:
- name: assembly
dtype: string
- name: c_source_code
dtype: string
splits:
- name: train
num_bytes: 453179468
num_examples: 229251
- name: val
num_bytes: 112995651
num_examples: 57313
- name: test
num_bytes: 141677299
num_examples: 71642
download_size: 184865673
dataset_size: 707852418
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
---
|
CyberHarem/kawakaze_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kawakaze/江風 (Kantai Collection)
This is the dataset of kawakaze/江風 (Kantai Collection), containing 500 images and their tags.
The core tags of this character are `long_hair, red_hair, hairband, ahoge, ribbon, hair_ribbon, twintails, bangs, very_long_hair, low_twintails, sidelocks, asymmetrical_bangs, blue_eyes, braid, twin_braids, yellow_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 465.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kawakaze_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 315.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kawakaze_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1144 | 651.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kawakaze_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 432.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kawakaze_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1144 | 829.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kawakaze_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kawakaze_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
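Building on the loop above, per-image tags can be aggregated to see which tags dominate the dataset. A minimal sketch, assuming each item's `meta['tags']` is iterable over tag names (the sample tag lists below are hypothetical):

```python
from collections import Counter

def tag_frequencies(tag_lists):
    # Count how many images each tag appears in; tags are assumed
    # unique within a single image's tag list.
    counter = Counter()
    for tags in tag_lists:
        counter.update(tags)
    return counter

# Hypothetical tag lists for three images; in practice, collect
# item.meta['tags'] from the LocalSource loop above.
freqs = tag_frequencies([
    ["1girl", "solo", "serafuku"],
    ["1girl", "solo", "black_gloves"],
    ["1girl", "cape"],
])
print(freqs.most_common(2))  # [('1girl', 3), ('solo', 2)]
```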
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | 1girl, black_gloves, black_skirt, elbow_gloves, fingerless_gloves, looking_at_viewer, neckerchief, pleated_skirt, serafuku, sleeveless_shirt, solo, belt, smile, navel, blush, collared_shirt, simple_background, white_background, bare_shoulders, black_thighhighs, open_mouth |
| 1 | 6 |  |  |  |  |  | 1girl, black_gloves, black_serafuku, black_skirt, collared_shirt, elbow_gloves, navel, one-hour_drawing_challenge, pleated_skirt, simple_background, sleeveless_shirt, solo, white_background, blue_neckerchief, fingerless_gloves, twitter_username, black_thighhighs, cowboy_shot, looking_at_viewer, white_belt, dated, smile |
| 2 | 14 |  |  |  |  |  | 1girl, looking_at_viewer, serafuku, sleeveless_shirt, solo, collared_shirt, elbow_gloves, upper_body, black_gloves, blue_neckerchief, bare_shoulders, fingerless_gloves, blush, navel, simple_background, grin |
| 3 | 15 |  |  |  |  |  | 1girl, black_gloves, fingerless_gloves, hair_flaps, scarf, serafuku, solo, looking_at_viewer, smile, cape, elbow_gloves, neckerchief, open_mouth, torpedo, black_skirt, machinery, pleated_skirt, turret |
| 4 | 6 |  |  |  |  |  | 1girl, cape, hair_flaps, serafuku, solo, chibi, fang, open_mouth, white_scarf, :d, ^_^, neckerchief, pleated_skirt, thighhighs, elbow_gloves, fingerless_gloves |
| 5 | 12 |  |  |  |  |  | 1girl, hair_flaps, solo, alternate_costume, employee_uniform, looking_at_viewer, pleated_skirt, black_skirt, smile, vertical-striped_shirt, cowboy_shot, open_mouth, name_tag, red_ribbon, simple_background |
| 6 | 5 |  |  |  |  |  | 1girl, alternate_costume, looking_at_viewer, simple_background, solo, blush, cowboy_shot, long_sleeves, twitter_username, white_background, hand_on_hip, one-hour_drawing_challenge, pleated_skirt, red_ribbon, school_uniform, smile, white_shirt, bowtie, closed_mouth, collared_shirt, cropped_legs, lips, pointy_ears |
| 7 | 18 |  |  |  |  |  | 1girl, solo, looking_at_viewer, adapted_costume, sailor_bikini, smile, black_bikini, simple_background, blush, white_background, navel, small_breasts, hair_flaps, medium_breasts |
| 8 | 10 |  |  |  |  |  | 1girl, looking_at_viewer, solo, white_background, alternate_costume, simple_background, artist_logo, blush, cowboy_shot, dated, dress, collarbone, grin, pointy_ears |
| 9 | 12 |  |  |  |  |  | 1girl, yukata, solo, alternate_costume, looking_at_viewer, floral_print, obi, smile, blush, candy_apple, fox_mask, mask_on_head, open_mouth |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_gloves | black_skirt | elbow_gloves | fingerless_gloves | looking_at_viewer | neckerchief | pleated_skirt | serafuku | sleeveless_shirt | solo | belt | smile | navel | blush | collared_shirt | simple_background | white_background | bare_shoulders | black_thighhighs | open_mouth | black_serafuku | one-hour_drawing_challenge | blue_neckerchief | twitter_username | cowboy_shot | white_belt | dated | upper_body | grin | hair_flaps | scarf | cape | torpedo | machinery | turret | chibi | fang | white_scarf | :d | ^_^ | thighhighs | alternate_costume | employee_uniform | vertical-striped_shirt | name_tag | red_ribbon | long_sleeves | hand_on_hip | school_uniform | white_shirt | bowtie | closed_mouth | cropped_legs | lips | pointy_ears | adapted_costume | sailor_bikini | black_bikini | small_breasts | medium_breasts | artist_logo | dress | collarbone | yukata | floral_print | obi | candy_apple | fox_mask | mask_on_head |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:--------------|:---------------|:--------------------|:--------------------|:--------------|:----------------|:-----------|:-------------------|:-------|:-------|:--------|:--------|:--------|:-----------------|:--------------------|:-------------------|:-----------------|:-------------------|:-------------|:-----------------|:-----------------------------|:-------------------|:-------------------|:--------------|:-------------|:--------|:-------------|:-------|:-------------|:--------|:-------|:----------|:------------|:---------|:--------|:-------|:--------------|:-----|:------|:-------------|:--------------------|:-------------------|:-------------------------|:-----------|:-------------|:---------------|:--------------|:-----------------|:--------------|:---------|:---------------|:---------------|:-------|:--------------|:------------------|:----------------|:---------------|:----------------|:-----------------|:--------------|:--------|:-------------|:---------|:---------------|:------|:--------------|:-----------|:---------------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | X | | X | | X | X | | X | X | | X | X | X | | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 14 |  |  |  |  |  | X | X | | X | X | X | | | X | X | X | | | X | X | X | X | | X | | | | | X | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 15 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | X | | X | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | | | X | X | | X | X | X | | X | | | | | | | | | | X | | | | | | | | | | X | | X | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 12 |  |  |  |  |  | X | | X | | | X | | X | | | X | | X | | | | X | | | | X | | | | | X | | | | | X | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | | | | | X | | X | | | X | | X | | X | X | X | X | | | | | X | | X | X | | | | | | | | | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 7 | 18 |  |  |  |  |  | X | | | | | X | | | | | X | | X | X | X | | X | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | |
| 8 | 10 |  |  |  |  |  | X | | | | | X | | | | | X | | | | X | | X | X | | | | | | | | X | | X | | X | | | | | | | | | | | | | X | | | | | | | | | | | | | X | | | | | | X | X | X | | | | | | |
| 9 | 12 |  |  |  |  |  | X | | | | | X | | | | | X | | X | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X |
|
YUiCHl/scale512 | ---
dataset_info:
features:
- name: image
dtype: image
- name: conditioning
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 172122937.0
num_examples: 1588
download_size: 171688682
dataset_size: 172122937.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
arturk0804/surgeBS | ---
license: openrail
---
|
Mitsuki-Sakamoto/alpaca_farm-deberta-re-pref-64-_fil_self_1.4b_bo2_100_kl_0.1_prm_410m_thr_0.3_seed_1 | ---
dataset_info:
config_name: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: index
dtype: int64
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
splits:
- name: epoch_0
num_bytes: 43630616
num_examples: 18929
- name: epoch_1
num_bytes: 43875800
num_examples: 18929
- name: epoch_2
num_bytes: 43837484
num_examples: 18929
- name: epoch_3
num_bytes: 43770380
num_examples: 18929
- name: epoch_4
num_bytes: 43752770
num_examples: 18929
- name: epoch_5
num_bytes: 43723463
num_examples: 18929
- name: epoch_6
num_bytes: 43701133
num_examples: 18929
- name: epoch_7
num_bytes: 43698431
num_examples: 18929
- name: epoch_8
num_bytes: 43687184
num_examples: 18929
- name: epoch_9
num_bytes: 43680253
num_examples: 18929
download_size: 232170131
dataset_size: 437357514
configs:
- config_name: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500
data_files:
- split: epoch_0
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_1-*
- split: epoch_2
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_2-*
- split: epoch_3
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_9-*
---
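Because each training epoch is stored as its own split (`epoch_0` through `epoch_9`) under a single long config name, loading a specific epoch requires passing both the config and the split. A minimal sketch (the `load_dataset` call itself needs network access to the Hub, so it is left commented):

```python
# Minimal sketch, assuming the `datasets` library is installed.
REPO_ID = (
    "Mitsuki-Sakamoto/alpaca_farm-deberta-re-pref-64-"
    "_fil_self_1.4b_bo2_100_kl_0.1_prm_410m_thr_0.3_seed_1"
)
CONFIG = (
    "alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa"
    "-checkpoint-7500"
)

# The ten per-epoch splits declared in the metadata above.
splits = [f"epoch_{i}" for i in range(10)]

# Network-dependent call, shown for illustration only:
# from datasets import load_dataset
# ds = load_dataset(REPO_ID, CONFIG, split=splits[0])  # 18,929 examples per epoch
```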
|
lexshinobi/lexshinobi | ---
license: openrail
---
|
Seanxh/twitter_dataset_1713083696 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 26044
num_examples: 61
download_size: 13451
dataset_size: 26044
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tyzhu/squad_qa_rare_v5_full_recite_full_passage_last_permute_rerun | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 8766940.477555942
num_examples: 4778
- name: validation
num_bytes: 582950
num_examples: 300
download_size: 1746544
dataset_size: 9349890.477555942
---
# Dataset Card for "squad_qa_rare_v5_full_recite_full_passage_last_permute_rerun"
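The metadata above declares a `train` and a `validation` split with the listed features. A minimal loading sketch (assumes the `datasets` library is installed; the download itself needs network access, so the calls are left commented):

```python
# Minimal sketch, based on the split and feature names in the card metadata.
REPO_ID = "tyzhu/squad_qa_rare_v5_full_recite_full_passage_last_permute_rerun"

# Feature names as declared in the metadata above.
features = [
    "id", "title", "context", "question", "answers",
    "answer", "context_id", "inputs", "targets",
]

# Network-dependent calls, shown for illustration only:
# from datasets import load_dataset
# train = load_dataset(REPO_ID, split="train")            # 4,778 examples
# validation = load_dataset(REPO_ID, split="validation")  # 300 examples
```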
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_TheBloke__Vicuna-13B-CoT-fp16 | ---
pretty_name: Evaluation run of TheBloke/Vicuna-13B-CoT-fp16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/Vicuna-13B-CoT-fp16](https://huggingface.co/TheBloke/Vicuna-13B-CoT-fp16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Vicuna-13B-CoT-fp16\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-22T14:12:38.922029](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Vicuna-13B-CoT-fp16/blob/main/results_2023-10-22T14-12-38.922029.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.029677013422818792,\n\
\ \"em_stderr\": 0.0017378324714143493,\n \"f1\": 0.09310612416107406,\n\
\ \"f1_stderr\": 0.002167792401176146,\n \"acc\": 0.4141695683211732,\n\
\ \"acc_stderr\": 0.010019161585538096\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.029677013422818792,\n \"em_stderr\": 0.0017378324714143493,\n\
\ \"f1\": 0.09310612416107406,\n \"f1_stderr\": 0.002167792401176146\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08642911296436695,\n \
\ \"acc_stderr\": 0.00774004433710381\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7419100236779794,\n \"acc_stderr\": 0.012298278833972384\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TheBloke/Vicuna-13B-CoT-fp16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|arc:challenge|25_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_22T14_12_38.922029
path:
- '**/details_harness|drop|3_2023-10-22T14-12-38.922029.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-22T14-12-38.922029.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_22T14_12_38.922029
path:
- '**/details_harness|gsm8k|5_2023-10-22T14-12-38.922029.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-22T14-12-38.922029.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hellaswag|10_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T15:25:40.141748.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-31T15:25:40.141748.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-31T15:25:40.141748.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_22T14_12_38.922029
path:
- '**/details_harness|winogrande|5_2023-10-22T14-12-38.922029.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-22T14-12-38.922029.parquet'
- config_name: results
data_files:
- split: 2023_07_31T15_25_40.141748
path:
- results_2023-07-31T15:25:40.141748.parquet
- split: 2023_10_22T14_12_38.922029
path:
- results_2023-10-22T14-12-38.922029.parquet
- split: latest
path:
- results_2023-10-22T14-12-38.922029.parquet
---
# Dataset Card for Evaluation run of TheBloke/Vicuna-13B-CoT-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Vicuna-13B-CoT-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Vicuna-13B-CoT-fp16](https://huggingface.co/TheBloke/Vicuna-13B-CoT-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Vicuna-13B-CoT-fp16",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-22T14:12:38.922029](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Vicuna-13B-CoT-fp16/blob/main/results_2023-10-22T14-12-38.922029.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.029677013422818792,
"em_stderr": 0.0017378324714143493,
"f1": 0.09310612416107406,
"f1_stderr": 0.002167792401176146,
"acc": 0.4141695683211732,
"acc_stderr": 0.010019161585538096
},
"harness|drop|3": {
"em": 0.029677013422818792,
"em_stderr": 0.0017378324714143493,
"f1": 0.09310612416107406,
"f1_stderr": 0.002167792401176146
},
"harness|gsm8k|5": {
"acc": 0.08642911296436695,
"acc_stderr": 0.00774004433710381
},
"harness|winogrande|5": {
"acc": 0.7419100236779794,
"acc_stderr": 0.012298278833972384
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Nielzac/CoM_Audio_Image_LLM_Generation | ---
license: mit
---

## This dataset is a mixture of DIBT/10k_prompts_ranked, lj_speech and Falah/image_generation_prompts_SDXL
### Repartition

### Why this dataset?
Training a multimodal router is of crucial significance in artificial intelligence. By harmonizing different specialized models within a constellation, the router plays a central role in intelligently orchestrating tasks. This approach not only enables precise classification but also paves the way for diverse applications of artificial intelligence, enhancing our ability to navigate the complexities of multimodal data. In short, training a multimodal router is a vital strategic advancement that opens exciting new prospects for the future of AI.
davanstrien/testmodelcardwdata | ---
dataset_info:
features:
- name: modelId
dtype: string
- name: sha
dtype: 'null'
- name: lastModified
dtype: 'null'
- name: pipeline_tag
dtype: string
- name: author
dtype: 'null'
- name: securityStatus
dtype: 'null'
- name: likes
dtype: int64
- name: downloads
dtype: int64
- name: dataset
sequence: string
- name: arxiv
sequence: string
- name: license
sequence: string
- name: tags
sequence: string
- name: doi
sequence: string
- name: card
dtype: string
splits:
- name: train
num_bytes: 541057
num_examples: 100
download_size: 163196
dataset_size: 541057
---
# Dataset Card for "testmodelcardwdata"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-futin__feed-top_vi_-7f787f-2245771645 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- futin/feed
eval_info:
task: text_zero_shot_classification
model: bigscience/bloom-7b1
metrics: []
dataset_name: futin/feed
dataset_config: top_vi_
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: bigscience/bloom-7b1
* Dataset: futin/feed
* Config: top_vi_
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@futin](https://huggingface.co/futin) for evaluating this model. |
CyberHarem/haguro_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of haguro/羽黒/羽黒 (Kantai Collection)
This is the dataset of haguro/羽黒/羽黒 (Kantai Collection), containing 500 images and their tags.
The core tags of this character are `black_hair, short_hair, hair_ornament, brown_eyes, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 512.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/haguro_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 323.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/haguro_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1129 | 667.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/haguro_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 464.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/haguro_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1129 | 896.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/haguro_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/haguro_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, black_skirt, looking_at_viewer, military_uniform, pencil_skirt, solo, white_gloves, purple_jacket, smile, alternate_legwear, open_mouth, simple_background, white_background, shirt, white_thighhighs, twitter_username, black_belt, cowboy_shot, juliet_sleeves |
| 1 | 28 |  |  |  |  |  | 1girl, black_skirt, solo, white_gloves, military_uniform, pencil_skirt, white_pantyhose, looking_at_viewer, long_sleeves, simple_background, open_mouth, white_background, belt, purple_jacket, smile, blush |
| 2 | 11 |  |  |  |  |  | 1girl, long_sleeves, military_uniform, purple_jacket, solo, upper_body, white_gloves, simple_background, white_background, looking_at_viewer, shirt, open_mouth, smile |
| 3 | 7 |  |  |  |  |  | 1girl, looking_at_viewer, military_uniform, purple_jacket, solo, upper_body, white_background, shirt, simple_background, smile, black_eyes, blush, dated, large_breasts, long_sleeves |
| 4 | 8 |  |  |  |  |  | 1girl, solo, white_gloves, blush, open_mouth, tears, smile |
| 5 | 9 |  |  |  |  |  | 1girl, blush, looking_at_viewer, solo, torn_pantyhose, white_gloves, elbow_gloves, medium_breasts, white_pantyhose, tears, sitting, skirt, open_mouth |
| 6 | 6 |  |  |  |  |  | 1girl, alternate_costume, blush, long_sleeves, looking_at_viewer, simple_background, solo, white_background, large_breasts, twitter_username, black_dress, black_eyes, habit, nun, open_mouth, cowboy_shot, hair_between_eyes, holding_book |
| 7 | 5 |  |  |  |  |  | 1girl, bikini, looking_at_viewer, solo, blush, large_breasts, simple_background, white_background, cowboy_shot, navel, cleavage, leaning_forward, open_mouth |
| 8 | 7 |  |  |  |  |  | 1girl, alternate_costume, obi, solo, looking_at_viewer, hair_between_eyes, purple_kimono, smile, black_eyes, blue_kimono, blush, dated, floral_print, open_mouth, upper_body, yukata |
| 9 | 5 |  |  |  |  |  | 1boy, 1girl, hetero, large_breasts, navel, nipples, solo_focus, cowgirl_position, girl_on_top, nude, open_mouth, sweat, vaginal, censored, dark-skinned_male, female_pubic_hair, happy_sex, penis, smile, bouncing_breasts, collarbone, heart, medium_breasts, nose_blush, pov, wedding_ring |
| 10 | 5 |  |  |  |  |  | 1girl, blue_sky, cowboy_shot, day, looking_at_viewer, outdoors, solo, cleavage, cloud, large_breasts, navel, ocean, black_bikini, blush, collarbone, hair_between_eyes, medium_breasts, smile, beach, black_eyes, open_clothes, open_mouth, tree, white_jacket, white_shirt |
| 11 | 8 |  |  |  |  |  | detached_collar, playboy_bunny, rabbit_ears, strapless_leotard, 1girl, cleavage, fake_animal_ears, solo, wrist_cuffs, rabbit_tail, simple_background, white_background, black_bowtie, blush, cowboy_shot, large_breasts, looking_at_viewer, medium_breasts, black_leotard, purple_leotard, embarrassed, open_mouth, tears, white_gloves |
| 12 | 5 |  |  |  |  |  | 1girl, black_neckerchief, black_panties, blue_skirt, blush, crop_top, elbow_gloves, shimakaze_(kancolle)_(cosplay), solo, white_gloves, blue_sailor_collar, cowboy_shot, highleg_panties, microskirt, miniskirt, pleated_skirt, striped_thighhighs, black_hairband, large_breasts, looking_at_viewer, thong, embarrassed, navel, open_mouth, serafuku |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_skirt | looking_at_viewer | military_uniform | pencil_skirt | solo | white_gloves | purple_jacket | smile | alternate_legwear | open_mouth | simple_background | white_background | shirt | white_thighhighs | twitter_username | black_belt | cowboy_shot | juliet_sleeves | white_pantyhose | long_sleeves | belt | blush | upper_body | black_eyes | dated | large_breasts | tears | torn_pantyhose | elbow_gloves | medium_breasts | sitting | skirt | alternate_costume | black_dress | habit | nun | hair_between_eyes | holding_book | bikini | navel | cleavage | leaning_forward | obi | purple_kimono | blue_kimono | floral_print | yukata | 1boy | hetero | nipples | solo_focus | cowgirl_position | girl_on_top | nude | sweat | vaginal | censored | dark-skinned_male | female_pubic_hair | happy_sex | penis | bouncing_breasts | collarbone | heart | nose_blush | pov | wedding_ring | blue_sky | day | outdoors | cloud | ocean | black_bikini | beach | open_clothes | tree | white_jacket | white_shirt | detached_collar | playboy_bunny | rabbit_ears | strapless_leotard | fake_animal_ears | wrist_cuffs | rabbit_tail | black_bowtie | black_leotard | purple_leotard | embarrassed | black_neckerchief | black_panties | blue_skirt | crop_top | shimakaze_(kancolle)_(cosplay) | blue_sailor_collar | highleg_panties | microskirt | miniskirt | pleated_skirt | striped_thighhighs | black_hairband | thong | serafuku |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------------|:--------------------|:-------------------|:---------------|:-------|:---------------|:----------------|:--------|:--------------------|:-------------|:--------------------|:-------------------|:--------|:-------------------|:-------------------|:-------------|:--------------|:-----------------|:------------------|:---------------|:-------|:--------|:-------------|:-------------|:--------|:----------------|:--------|:-----------------|:---------------|:-----------------|:----------|:--------|:--------------------|:--------------|:--------|:------|:--------------------|:---------------|:---------|:--------|:-----------|:------------------|:------|:----------------|:--------------|:---------------|:---------|:-------|:---------|:----------|:-------------|:-------------------|:--------------|:-------|:--------|:----------|:-----------|:--------------------|:--------------------|:------------|:--------|:-------------------|:-------------|:--------|:-------------|:------|:---------------|:-----------|:------|:-----------|:--------|:--------|:---------------|:--------|:---------------|:-------|:---------------|:--------------|:------------------|:----------------|:--------------|:--------------------|:-------------------|:--------------|:--------------|:---------------|:----------------|:-----------------|:--------------|:--------------------|:----------------|:-------------|:-----------|:---------------------------------|:---------------------|:------------------|:-------------|:------------|:----------------|:---------------------|:-----------------|:--------|:-----------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 28 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | X | X | X | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 11 |  |  |  |  |  | X | | X | X | | X | X | X | X | | X | X | X | X | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | | X | X | | X | | X | X | | | X | X | X | | | | | | | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 8 |  |  |  |  |  | X | | | | | X | X | | X | | X | | | | | | | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 9 |  |  |  |  |  | X | | X | | | X | X | | | | X | | | | | | | | | X | | | X | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | | X | | | X | | | | | X | X | X | | | X | | X | | | X | | X | | X | | X | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | | X | | | X | | | | | X | X | X | | | | | X | | | | | X | | | | X | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 7 |  |  |  |  |  | X | | X | | | X | | | X | | X | | | | | | | | | | | | X | X | X | X | | | | | | | | X | | | | X | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 5 |  |  |  |  |  | X | | | | | | | | X | | X | | | | | | | | | | | | | | | | X | | | | X | | | | | | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 5 |  |  |  |  |  | X | | X | | | X | | | X | | X | | | | | | | X | | | | | X | | X | | X | | | | X | | | | | | | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 11 | 8 |  |  |  |  |  | X | | X | | | X | X | | | | X | X | X | | | | | X | | | | | X | | | | X | X | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 12 | 5 |  |  |  |  |  | X | | X | | | X | X | | | | X | | | | | | | X | | | | | X | | | | X | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
andersonbcdefg/sft_code_submix | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 1399630171.2578096
num_examples: 1146098
download_size: 501178967
dataset_size: 1399630171.2578096
---
# Dataset Card for "sft_code_submix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TrainingDataPro/celeba-spoof-dataset | ---
license: cc-by-nc-nd-4.0
task_categories:
- video-classification
- image-classification
- image-to-video
language:
- en
tags:
- code
- finance
- legal
- webdataset
---
# Biometric Attack Dataset
# The dataset is created on the basis of [Anti Spoofing Real Dataset](https://trainingdata.pro/data-market/anti-spoofing-real/?utm_source=huggingface&utm_medium=cpc&utm_campaign=celebA)
This dataset is similar to the CelebA Dataset but contains photos of **real people**; in addition, this face anti-spoofing and face recognition dataset includes not only images but also videos of the individuals!
The videos were gathered by recording genuine individuals and the facial presentations used in spoofing attacks. The dataset supports approaches that learn to detect spoofing techniques by extracting features from genuine facial images, preventing such information from being captured by fake users.
The dataset contains images and videos of real humans with various **resolutions, views, and colors**, making it a comprehensive resource for researchers working on anti-spoofing technologies.
### People in the dataset
.png?generation=1707303383706326&alt=media)
### Types of files in the dataset:
- **photo** - selfie of the person
- **video** - real video of the person
Our dataset also explores the use of neural architectures, such as deep neural networks, to facilitate the identification of distinguishing patterns and textures in different regions of the face, increasing the accuracy and generalizability of the anti-spoofing models.
# 💴 For Commercial Usage: Full version of the dataset includes 98,000 files, leave a request on **[TrainingData](https://trainingdata.pro/data-market/anti-spoofing-real/?utm_source=huggingface&utm_medium=cpc&utm_campaign=celebA)** to buy the dataset
### Metadata for the full dataset:
- **assignment_id** - unique identifier of the media file
- **worker_id** - unique identifier of the person
- **age** - age of the person
- **true_gender** - gender of the person
- **country** - country of the person
- **ethnicity** - ethnicity of the person
- **video_extension** - video extensions in the dataset
- **video_resolution** - video resolution in the dataset
- **video_duration** - video duration in the dataset
- **video_fps** - frames per second for video in the dataset
- **photo_extension** - photo extensions in the dataset
- **photo_resolution** - photo resolution in the dataset

# 💴 Buy the Dataset: This is just an example of the data. Leave a request on **[https://trainingdata.pro/data-market](https://trainingdata.pro/data-market/anti-spoofing-real/?utm_source=huggingface&utm_medium=cpc&utm_campaign=celebA) to learn about the price and buy the dataset**
# Content
The dataset includes 2 folders:
- **video** - real videos of the people,
- **photo** - selfies of the same people from the previous folder
### File with the extension .csv
- **id**: id of the person,
- **photo**: link to access the photo,
- **video**: link to access the video
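As a minimal sketch of how the CSV described above could be consumed (the column names follow the card, but the rows and paths below are made-up placeholders, not real dataset links), one might pair each person's selfie with their video like this:

```python
import csv
import io

# Hypothetical sample matching the documented columns (id, photo, video);
# the paths here are placeholders, not real dataset URLs.
csv_text = """id,photo,video
101,photos/101.jpg,videos/101.mp4
102,photos/102.jpg,videos/102.mp4
"""

rows = list(csv.DictReader(io.StringIO(csv_text)))
# Pair each person's selfie with their video for downstream processing
pairs = {row["id"]: (row["photo"], row["video"]) for row in rows}
print(pairs["101"])  # ('photos/101.jpg', 'videos/101.mp4')
```

In practice you would replace the inline sample with the actual `.csv` file shipped with the dataset.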
## **[TrainingData](https://trainingdata.pro/data-market/anti-spoofing-real/?utm_source=huggingface&utm_medium=cpc&utm_campaign=celebA)** provides high-quality data annotation tailored to your needs
More datasets in TrainingData's Kaggle account: **<https://www.kaggle.com/trainingdatapro/datasets>**
TrainingData's GitHub: **https://github.com/Trainingdata-datamarket/TrainingData_All_datasets**
*keywords: liveness detection systems, liveness detection dataset, biometric dataset, biometric data dataset, biometric system attacks, anti-spoofing dataset, face liveness detection, deep learning dataset, face spoofing database, face anti-spoofing, ibeta dataset, face anti spoofing, large-scale face anti spoofing, rich annotations anti spoofing dataset* |
julien040/hacker-news-posts | ---
license: cc-by-nc-sa-4.0
language:
- en
tags:
- hacker news
pretty_name: Hacker News stories dataset
size_categories:
- 1M<n<10M
source_datasets:
- Hacker News
---
# Hacker News Stories Dataset
This is a dataset containing approximately 4 million stories from Hacker News, exported to a CSV file. The dataset includes the following fields:
- `id` (int64): The unique identifier of the story.
- `title` (string): The title of the story.
- `url` (string): The URL of the story.
- `score` (int64): The score of the story.
- `time` (int64): The time the story was posted, in Unix time.
- `comments` (int64): The number of comments on the story.
- `author` (string): The username of the person who posted the story.
## Accessing the Dataset
The dataset can be accessed through [Hugging Face Datasets](https://huggingface.co/datasets/julien040/hacker-news-posts). You can download the dataset in CSV format or use the Hugging Face Datasets library to load the dataset directly in your Python code.
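As a sketch of working with the fields listed above (the rows below are made-up examples, not real Hacker News stories), the CSV can be loaded with pandas and the Unix `time` column converted to datetimes:

```python
import io
import pandas as pd

# Made-up sample rows matching the documented schema (not real stories)
csv_text = """id,title,url,score,time,comments,author
1,Example story,https://example.com,120,1672531200,45,alice
2,Another story,https://example.org,15,1672617600,3,bob
"""

df = pd.read_csv(io.StringIO(csv_text))
# `time` is Unix time, so convert it to a timezone-aware datetime
df["posted_at"] = pd.to_datetime(df["time"], unit="s", utc=True)
# Sort by score to surface the most popular stories first
top = df.sort_values("score", ascending=False)
print(top[["title", "score", "posted_at"]].head())
```

For the full dataset, point `pd.read_csv` at the downloaded CSV file instead of the inline sample.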
## License
The dataset is made available under the [Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License](https://creativecommons.org/licenses/by-nc-sa/4.0/).
## Disclaimer
The dataset is provided as is, without warranty of any kind, express or implied. The owner of the dataset makes no representations or warranties, express or implied, regarding the dataset or its use. The owner of the dataset will not be liable for any damages arising out of or in connection with the use of the dataset.
## Updates
The dataset will be updated regularly to include new stories from Hacker News. |
sbenel/yanuq_it | ---
license: apache-2.0
dataset_info:
features:
- name: title
dtype: string
- name: content
dtype: string
- name: messages
sequence: 'null'
- name: generation_model
sequence: string
- name: generation_prompt
sequence: string
- name: raw_generation_responses
sequence: string
- name: conversation
sequence:
sequence: string
splits:
- name: train
num_bytes: 2615126
num_examples: 832
download_size: 1251249
dataset_size: 2615126
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
rishiai/indian-court-judgements-and-its-summaries | ---
license: apache-2.0
---
|
result-kand2-sdxl-wuerst-karlo/8e18a25b | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 191
num_examples: 10
download_size: 1358
dataset_size: 191
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "8e18a25b"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
modelloosrvcc/Peppa | ---
license: openrail
---
|
mfidabel/sam-coyo-3k | ---
dataset_info:
features:
- name: image
dtype: image
- name: conditioning_image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 2194166543.838
num_examples: 3113
download_size: 2199147437
dataset_size: 2194166543.838
---
# Dataset Card for "sam-coyo-3k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AlexAmin/consent-chat | ---
license: mit
---
|
jlbaker361/spider-300 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: frame
dtype: int64
splits:
- name: train
num_bytes: 1754907838.0
num_examples: 400
download_size: 1754980818
dataset_size: 1754907838.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sayalaruano/FakeNewsSpanish_Kaggle2 | ---
license: cc-by-nc-sa-4.0
---
This dataset was obtained from: https://www.kaggle.com/datasets/zulanac/fake-and-real-news |
turing-motors/Japanese-Heron-Bench | ---
size_categories:
- n<1K
task_categories:
- visual-question-answering
language:
- ja
---
# Japanese-Heron-Bench
## Dataset Description
**Japanese-Heron-Bench** is a benchmark for evaluating Japanese VLMs (Vision-Language Models). We collected 21 images related to Japan. We then set up three categories for each image: Conversation, Detail, and Complex, and prepared one or two questions for each category. The final evaluation dataset consists of 102 questions. Furthermore, each image is assigned one of seven subcategories: anime, art, culture, food, landscape, landmark, and transportation.
For more details and the run script, please visit our [GitHub repository](https://github.com/turingmotors/heron).
## Uses
We have collected images that are either in the public domain or licensed under Creative Commons Attribution 1.0 (CC BY 1.0) or Creative Commons Attribution 2.0 (CC BY 2.0). Please refer to the [LICENCE.md](LICENCE.md) file for details on the licenses.
## Citation
```bibtex
@misc{inoue2024heronbench,
title={Heron-Bench: A Benchmark for Evaluating Vision Language Models in Japanese},
author={Yuichi Inoue and Kento Sasaki and Yuma Ochi and Kazuki Fujii and Kotaro Tanahashi and Yu Yamaguchi},
year={2024},
eprint={2404.07824},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
|
mikiyax/MusicCap | ---
dataset_info:
features:
- name: file_name
dtype: string
- name: image
dtype: image
- name: tensor
sequence:
sequence: float32
splits:
- name: train
num_bytes: 2783267159.0
num_examples: 390
download_size: 1395248585
dataset_size: 2783267159.0
---
# Dataset Card for "MusicCap"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
karthik4709/train | ---
license: llama2
---
|
Query-of-CC/knowledge_pile_full | ---
license: apache-2.0
language:
- en
tags:
- knowledge
- cc
- Retrieval
- Reasoning
---
Knowledge Pile is a knowledge-related dataset built with [Query of CC](https://arxiv.org/abs/2401.14624), totaling 735GB on disk and 188B tokens (using the Llama2 tokenizer).
## *Query of CC*
As shown in the figure below, we initially collected seed information in specific domains, such as keywords, frequently asked questions, and textbooks, to serve as inputs for the Query Bootstrapping stage. Leveraging the strong generalization capability of large language models, we can effortlessly expand this initial seed information into a large set of domain-relevant queries. Inspired by Self-Instruct and WizardLM, we used two stages of expansion, namely **Question Extension** and **Thought Generation**, which extend the queries in breadth and depth respectively, allowing us to retrieve domain-related data with broader scope and deeper reasoning. Based on these queries, we then retrieved relevant documents from public corpora and, after duplicate removal and filtering, formed the final training dataset.

## **Knowledge Pile** Statistics
Based on *Query of CC*, we have formed a high-quality knowledge dataset, **Knowledge Pile**. As shown in the figure below, compared with other datasets in the academic and mathematical reasoning domains, we acquired a large-scale, high-quality knowledge dataset at a lower cost, without manual intervention. Through automated query bootstrapping, we efficiently capture the information around the seed queries. **Knowledge Pile** not only covers mathematical reasoning data but also encompasses rich knowledge-oriented corpora spanning fields such as biology and physics, enhancing its potential for research and application.
<img src="https://github.com/ngc7292/query_of_cc/blob/master/images/query_of_cc_timestamp_prop.png?raw=true" width="300px" style="center"/>
The table below presents the top 10 web domains with the highest proportion in **Knowledge Pile**, primarily academic websites, high-quality forums, and knowledge-domain sites. The figure above provides a breakdown of the data sources' timestamps in **Knowledge Pile**, with statistics conducted on an annual basis. A significant portion of **Knowledge Pile** is sourced from recent years, with a decreasing proportion for earlier timestamps. This trend can be attributed to the exponential growth of internet data and the inherent recency of the sources crawled for **Knowledge Pile**.
| **Web Domain** | **Count** |
|----------------------------|----------------|
|en.wikipedia.org | 398833 |
|www.semanticscholar.org | 141268 |
|slideplayer.com | 108177 |
|www.ncbi.nlm.nih.gov | 97009 |
|link.springer.com | 85357 |
|www.ipl.org | 84084 |
|pubmed.ncbi.nlm.nih.gov | 68934 |
|www.reference.com | 61658 |
|www.bartleby.com | 60097 |
|quizlet.com | 56752 |
### Citation
```
@article{fei2024query,
title={Query of CC: Unearthing Large Scale Domain-Specific Knowledge from Public Corpora},
author={Fei, Zhaoye and Shao, Yunfan and Li, Linyang and Zeng, Zhiyuan and Yan, Hang and Qiu, Xipeng and Lin, Dahua},
journal={arXiv preprint arXiv:2401.14624},
year={2024}
}
``` |
scholarly-shadows-syndicate/hotpotqa_with_qa_gpt35 | ---
license: apache-2.0
---
# HotpotQA Dataset with GPT-3.5 Generated Questions
## Overview
This repository hosts an enhanced version of the HotpotQA dataset, where each supporting sentence in the dataset has been supplemented with questions generated using OpenAI's GPT-3.5 turbo API. The aim is to provide a richer context for each entry, potentially benefiting various NLP tasks, such as question answering and context understanding.
## Dataset Format
Each entry in the dataset is formatted as follows:
```json
{
"answer": "This is the answer",
"context": {
"sentences": [["Sent 1"], ["Sent 21", "Sent 22"]],
"title": ["Title1", "Title 2"],
"questions": [["Ques 1"], ["Ques 21", "Ques 22"]], // newly added
"paraphrased_questions": [["Para Ques 1"], ["Para Ques 21", "Para Ques 22"]], // newly added
},
"id": "000001",
"level": "medium",
"question": "What is the answer?",
"supporting_facts": {
"sent_id": [0, 1, 3],
"title": ["Title of para 1", "Title of para 2", "Title of para 3"]
},
"type": "comparison"
}
```
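Given the format above, the newly added `questions` field is parallel to `sentences`: one generated question per supporting sentence. A minimal sketch (using only the field names documented above) of pairing them up:

```python
# Example entry shaped like the documented format (values are placeholders).
entry = {
    "context": {
        "sentences": [["Sent 1"], ["Sent 21", "Sent 22"]],
        "questions": [["Ques 1"], ["Ques 21", "Ques 22"]],
    },
}

def pair_sentences_with_questions(entry):
    """Flatten the parallel lists into (sentence, generated question) pairs."""
    ctx = entry["context"]
    pairs = []
    for sents, quests in zip(ctx["sentences"], ctx["questions"]):
        pairs.extend(zip(sents, quests))
    return pairs
```

The same pairing applies to `paraphrased_questions`, which mirrors the structure of `questions`.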
## Important Notices
### 1. Training Split Unavailability
As of now, the training split of this enhanced dataset is still under computation and is not available. We are actively working on this and will update the repository once it's ready.
### 2. Commercial Usage Caution
Users of this dataset should be aware that the questions generated by OpenAI's GPT-3.5 turbo API may not be suitable for commercial use, as per the OpenAI terms of service. We advise caution and suggest reviewing OpenAI's policies before any commercial deployment.
### 3. Citation for Original Dataset
This enhanced dataset is based on the HotpotQA dataset. Users of this enhanced dataset should also cite the original HotpotQA dataset. For more information about the original dataset, please visit [HotpotQA Dataset on Hugging Face](https://huggingface.co/datasets/hotpot_qa).
## Acknowledgements
This dataset enhancement was made possible by OpenAI's GPT-3.5 turbo API, and the original dataset was provided by the creators of HotpotQA. We thank both parties for their contributions to the field of natural language processing and machine learning.
|
yashraizad/yelp-open-dataset-checkin | ---
license: apache-2.0
---
|
AsphyXIA/wikipedia-kn | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 389760422
num_examples: 31437
download_size: 139254937
dataset_size: 389760422
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
huggingface-projects/auto-retrain-input-dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': ADONIS
'1': AFRICAN GIANT SWALLOWTAIL
'2': AMERICAN SNOOT
splits:
- name: train
num_bytes: 8825732.0
num_examples: 338
download_size: 8823395
dataset_size: 8825732.0
---
# Dataset Card for "input-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DZN222/x3 | ---
license: openrail
---
|
benayas/snips_artificial_10pct_v0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: category
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1125034
num_examples: 13084
download_size: 415472
dataset_size: 1125034
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
appvoid/no-prompt-15k | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 37820576
num_examples: 15000
download_size: 20067913
dataset_size: 37820576
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# No Prompt
This is a dataset created to test language models on generating high-quality, useful text without prompt formatting. It is built by simply stripping the prompt formatting from the source dataset, be it Guanaco, OpenAssistant, etc. |
kunal18/ScienceQA-filtered | ---
dataset_info:
features:
- name: image
dtype: image
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype: int8
- name: hint
dtype: string
- name: task
dtype: string
- name: grade
dtype: string
- name: subject
dtype: string
- name: topic
dtype: string
- name: category
dtype: string
- name: skill
dtype: string
- name: lecture
dtype: string
- name: solution
dtype: string
splits:
- name: train
num_bytes: 129830551.85934308
num_examples: 3914
- name: validation
num_bytes: 43876378.18627682
num_examples: 1328
- name: test
num_bytes: 39380154.55600094
num_examples: 1208
download_size: 392389887
dataset_size: 213087084.60162085
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
kheopss/prompt_coversation4 | ---
dataset_info:
features:
- name: input
dtype: string
- name: response
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 11064226
num_examples: 1960
download_size: 3966772
dataset_size: 11064226
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
RikoteMaster/isear_rauw | ---
dataset_info:
features:
- name: Emotion
dtype: string
- name: Text
dtype: string
- name: Text_processed
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 3546767
num_examples: 5637
- name: test
num_bytes: 1177770
num_examples: 1879
download_size: 1908246
dataset_size: 4724537
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
lithium0003/findtextCenterNet_dataset | ---
license: mit
---
|
open-llm-leaderboard/details_NoIdeaLand__test-2048-1500ck | ---
pretty_name: Evaluation run of NoIdeaLand/test-2048-1500ck
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NoIdeaLand/test-2048-1500ck](https://huggingface.co/NoIdeaLand/test-2048-1500ck)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NoIdeaLand__test-2048-1500ck\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-14T04:39:40.489809](https://huggingface.co/datasets/open-llm-leaderboard/details_NoIdeaLand__test-2048-1500ck/blob/main/results_2023-09-14T04-39-40.489809.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26196111221791213,\n\
\ \"acc_stderr\": 0.03173586961427775,\n \"acc_norm\": 0.2653334325357461,\n\
\ \"acc_norm_stderr\": 0.03173833592722594,\n \"mc1\": 0.23990208078335373,\n\
\ \"mc1_stderr\": 0.014948812679062137,\n \"mc2\": 0.4095943166947606,\n\
\ \"mc2_stderr\": 0.014642509125225842\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.33532423208191126,\n \"acc_stderr\": 0.013796182947785564,\n\
\ \"acc_norm\": 0.36689419795221845,\n \"acc_norm_stderr\": 0.014084133118104294\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.45817566221868156,\n\
\ \"acc_stderr\": 0.004972293764978723,\n \"acc_norm\": 0.6255725951005776,\n\
\ \"acc_norm_stderr\": 0.004829856058603573\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.24444444444444444,\n\
\ \"acc_stderr\": 0.037125378336148665,\n \"acc_norm\": 0.24444444444444444,\n\
\ \"acc_norm_stderr\": 0.037125378336148665\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19078947368421054,\n \"acc_stderr\": 0.031975658210325,\n\
\ \"acc_norm\": 0.19078947368421054,\n \"acc_norm_stderr\": 0.031975658210325\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.35,\n\
\ \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.24150943396226415,\n \"acc_stderr\": 0.026341480371118366,\n\
\ \"acc_norm\": 0.24150943396226415,\n \"acc_norm_stderr\": 0.026341480371118366\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2986111111111111,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.2986111111111111,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.18,\n\
\ \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909281,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909281\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n\
\ \"acc_stderr\": 0.03063114553919882,\n \"acc_norm\": 0.2023121387283237,\n\
\ \"acc_norm_stderr\": 0.03063114553919882\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3191489361702128,\n \"acc_stderr\": 0.030472973363380042,\n\
\ \"acc_norm\": 0.3191489361702128,\n \"acc_norm_stderr\": 0.030472973363380042\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2896551724137931,\n \"acc_stderr\": 0.03780019230438015,\n\
\ \"acc_norm\": 0.2896551724137931,\n \"acc_norm_stderr\": 0.03780019230438015\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2328042328042328,\n \"acc_stderr\": 0.021765961672154523,\n \"\
acc_norm\": 0.2328042328042328,\n \"acc_norm_stderr\": 0.021765961672154523\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23015873015873015,\n\
\ \"acc_stderr\": 0.03764950879790606,\n \"acc_norm\": 0.23015873015873015,\n\
\ \"acc_norm_stderr\": 0.03764950879790606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.22258064516129034,\n\
\ \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.22258064516129034,\n\
\ \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.23645320197044334,\n \"acc_stderr\": 0.029896114291733552,\n\
\ \"acc_norm\": 0.23645320197044334,\n \"acc_norm_stderr\": 0.029896114291733552\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\"\
: 0.19,\n \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3151515151515151,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.3151515151515151,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.19696969696969696,\n \"acc_stderr\": 0.02833560973246335,\n \"\
acc_norm\": 0.19696969696969696,\n \"acc_norm_stderr\": 0.02833560973246335\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.30569948186528495,\n \"acc_stderr\": 0.03324837939758159,\n\
\ \"acc_norm\": 0.30569948186528495,\n \"acc_norm_stderr\": 0.03324837939758159\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2692307692307692,\n \"acc_stderr\": 0.022489389793654824,\n\
\ \"acc_norm\": 0.2692307692307692,\n \"acc_norm_stderr\": 0.022489389793654824\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25555555555555554,\n \"acc_stderr\": 0.026593939101844086,\n \
\ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.026593939101844086\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.22268907563025211,\n \"acc_stderr\": 0.02702543349888239,\n\
\ \"acc_norm\": 0.22268907563025211,\n \"acc_norm_stderr\": 0.02702543349888239\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2119205298013245,\n \"acc_stderr\": 0.03336767086567977,\n \"\
acc_norm\": 0.2119205298013245,\n \"acc_norm_stderr\": 0.03336767086567977\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22018348623853212,\n \"acc_stderr\": 0.01776597865232757,\n \"\
acc_norm\": 0.22018348623853212,\n \"acc_norm_stderr\": 0.01776597865232757\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2361111111111111,\n \"acc_stderr\": 0.028963702570791033,\n \"\
acc_norm\": 0.2361111111111111,\n \"acc_norm_stderr\": 0.028963702570791033\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.28431372549019607,\n \"acc_stderr\": 0.03166009679399812,\n \"\
acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.03166009679399812\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2869198312236287,\n \"acc_stderr\": 0.02944377302259469,\n \
\ \"acc_norm\": 0.2869198312236287,\n \"acc_norm_stderr\": 0.02944377302259469\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.39461883408071746,\n\
\ \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.39461883408071746,\n\
\ \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.24793388429752067,\n \"acc_stderr\": 0.03941897526516302,\n \"\
acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.03941897526516302\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.043300437496507437,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.043300437496507437\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.25153374233128833,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.25153374233128833,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.04327040932578728,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.04327040932578728\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3076923076923077,\n\
\ \"acc_stderr\": 0.030236389942173095,\n \"acc_norm\": 0.3076923076923077,\n\
\ \"acc_norm_stderr\": 0.030236389942173095\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2835249042145594,\n\
\ \"acc_stderr\": 0.016117318166832265,\n \"acc_norm\": 0.2835249042145594,\n\
\ \"acc_norm_stderr\": 0.016117318166832265\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2745664739884393,\n \"acc_stderr\": 0.02402774515526501,\n\
\ \"acc_norm\": 0.2745664739884393,\n \"acc_norm_stderr\": 0.02402774515526501\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.025553169991826507,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.025553169991826507\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2347266881028939,\n\
\ \"acc_stderr\": 0.024071805887677045,\n \"acc_norm\": 0.2347266881028939,\n\
\ \"acc_norm_stderr\": 0.024071805887677045\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2553191489361702,\n \"acc_stderr\": 0.02601199293090202,\n \
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.02601199293090202\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2627118644067797,\n\
\ \"acc_stderr\": 0.01124054551499566,\n \"acc_norm\": 0.2627118644067797,\n\
\ \"acc_norm_stderr\": 0.01124054551499566\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.21691176470588236,\n \"acc_stderr\": 0.025035845227711254,\n\
\ \"acc_norm\": 0.21691176470588236,\n \"acc_norm_stderr\": 0.025035845227711254\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.27450980392156865,\n \"acc_stderr\": 0.0180540274588152,\n \
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.0180540274588152\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2818181818181818,\n\
\ \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.2818181818181818,\n\
\ \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.1836734693877551,\n \"acc_stderr\": 0.024789071332007653,\n\
\ \"acc_norm\": 0.1836734693877551,\n \"acc_norm_stderr\": 0.024789071332007653\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.29518072289156627,\n\
\ \"acc_stderr\": 0.035509201856896294,\n \"acc_norm\": 0.29518072289156627,\n\
\ \"acc_norm_stderr\": 0.035509201856896294\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.03565079670708311,\n\
\ \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.03565079670708311\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23990208078335373,\n\
\ \"mc1_stderr\": 0.014948812679062137,\n \"mc2\": 0.4095943166947606,\n\
\ \"mc2_stderr\": 0.014642509125225842\n }\n}\n```"
repo_url: https://huggingface.co/NoIdeaLand/test-2048-1500ck
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|arc:challenge|25_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hellaswag|10_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T04-39-40.489809.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T04-39-40.489809.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-14T04-39-40.489809.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-14T04-39-40.489809.parquet'
- config_name: results
data_files:
- split: 2023_09_14T04_39_40.489809
path:
- results_2023-09-14T04-39-40.489809.parquet
- split: latest
path:
- results_2023-09-14T04-39-40.489809.parquet
---
# Dataset Card for Evaluation run of NoIdeaLand/test-2048-1500ck
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NoIdeaLand/test-2048-1500ck
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [NoIdeaLand/test-2048-1500ck](https://huggingface.co/NoIdeaLand/test-2048-1500ck) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NoIdeaLand__test-2048-1500ck",
"harness_truthfulqa_mc_0",
	split="latest")
```
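As a side note, the per-task `acc_stderr` values reported below appear consistent with the sample standard error of a proportion, sqrt(p * (1 - p) / (n - 1)). A minimal sketch checking this against the abstract_algebra numbers, assuming n = 100 test questions (the usual MMLU abstract_algebra subset size — an assumption, not stated in this card):

```python
import math

# Reported accuracy for harness|hendrycksTest-abstract_algebra|5 (see below).
acc = 0.21
n = 100  # assumed number of abstract_algebra test questions

# Sample standard error of a proportion, with the n-1 (Bessel) correction.
stderr = math.sqrt(acc * (1 - acc) / (n - 1))

print(stderr)  # ≈ 0.040936, matching the reported acc_stderr
```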
## Latest results
These are the [latest results from run 2023-09-14T04:39:40.489809](https://huggingface.co/datasets/open-llm-leaderboard/details_NoIdeaLand__test-2048-1500ck/blob/main/results_2023-09-14T04-39-40.489809.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split of the corresponding eval):
```python
{
"all": {
"acc": 0.26196111221791213,
"acc_stderr": 0.03173586961427775,
"acc_norm": 0.2653334325357461,
"acc_norm_stderr": 0.03173833592722594,
"mc1": 0.23990208078335373,
"mc1_stderr": 0.014948812679062137,
"mc2": 0.4095943166947606,
"mc2_stderr": 0.014642509125225842
},
"harness|arc:challenge|25": {
"acc": 0.33532423208191126,
"acc_stderr": 0.013796182947785564,
"acc_norm": 0.36689419795221845,
"acc_norm_stderr": 0.014084133118104294
},
"harness|hellaswag|10": {
"acc": 0.45817566221868156,
"acc_stderr": 0.004972293764978723,
"acc_norm": 0.6255725951005776,
"acc_norm_stderr": 0.004829856058603573
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.037125378336148665,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.037125378336148665
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19078947368421054,
"acc_stderr": 0.031975658210325,
"acc_norm": 0.19078947368421054,
"acc_norm_stderr": 0.031975658210325
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.24150943396226415,
"acc_stderr": 0.026341480371118366,
"acc_norm": 0.24150943396226415,
"acc_norm_stderr": 0.026341480371118366
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2986111111111111,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.2986111111111111,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2023121387283237,
"acc_stderr": 0.03063114553919882,
"acc_norm": 0.2023121387283237,
"acc_norm_stderr": 0.03063114553919882
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3191489361702128,
"acc_stderr": 0.030472973363380042,
"acc_norm": 0.3191489361702128,
"acc_norm_stderr": 0.030472973363380042
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2896551724137931,
"acc_stderr": 0.03780019230438015,
"acc_norm": 0.2896551724137931,
"acc_norm_stderr": 0.03780019230438015
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2328042328042328,
"acc_stderr": 0.021765961672154523,
"acc_norm": 0.2328042328042328,
"acc_norm_stderr": 0.021765961672154523
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.03764950879790606,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.03764950879790606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.22258064516129034,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.22258064516129034,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.23645320197044334,
"acc_stderr": 0.029896114291733552,
"acc_norm": 0.23645320197044334,
"acc_norm_stderr": 0.029896114291733552
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3151515151515151,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.3151515151515151,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.19696969696969696,
"acc_stderr": 0.02833560973246335,
"acc_norm": 0.19696969696969696,
"acc_norm_stderr": 0.02833560973246335
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.30569948186528495,
"acc_stderr": 0.03324837939758159,
"acc_norm": 0.30569948186528495,
"acc_norm_stderr": 0.03324837939758159
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2692307692307692,
"acc_stderr": 0.022489389793654824,
"acc_norm": 0.2692307692307692,
"acc_norm_stderr": 0.022489389793654824
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.026593939101844086,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.026593939101844086
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.22268907563025211,
"acc_stderr": 0.02702543349888239,
"acc_norm": 0.22268907563025211,
"acc_norm_stderr": 0.02702543349888239
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2119205298013245,
"acc_stderr": 0.03336767086567977,
"acc_norm": 0.2119205298013245,
"acc_norm_stderr": 0.03336767086567977
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22018348623853212,
"acc_stderr": 0.01776597865232757,
"acc_norm": 0.22018348623853212,
"acc_norm_stderr": 0.01776597865232757
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.028963702570791033,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.028963702570791033
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.03166009679399812,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.03166009679399812
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2869198312236287,
"acc_stderr": 0.02944377302259469,
"acc_norm": 0.2869198312236287,
"acc_norm_stderr": 0.02944377302259469
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.39461883408071746,
"acc_stderr": 0.03280400504755291,
"acc_norm": 0.39461883408071746,
"acc_norm_stderr": 0.03280400504755291
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.03941897526516302,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.03941897526516302
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.043300437496507437,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.043300437496507437
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25153374233128833,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.25153374233128833,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578728,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578728
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3076923076923077,
"acc_stderr": 0.030236389942173095,
"acc_norm": 0.3076923076923077,
"acc_norm_stderr": 0.030236389942173095
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2835249042145594,
"acc_stderr": 0.016117318166832265,
"acc_norm": 0.2835249042145594,
"acc_norm_stderr": 0.016117318166832265
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2745664739884393,
"acc_stderr": 0.02402774515526501,
"acc_norm": 0.2745664739884393,
"acc_norm_stderr": 0.02402774515526501
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.025553169991826507,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.025553169991826507
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2347266881028939,
"acc_stderr": 0.024071805887677045,
"acc_norm": 0.2347266881028939,
"acc_norm_stderr": 0.024071805887677045
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.02601199293090202,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.02601199293090202
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2627118644067797,
"acc_stderr": 0.01124054551499566,
"acc_norm": 0.2627118644067797,
"acc_norm_stderr": 0.01124054551499566
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.21691176470588236,
"acc_stderr": 0.025035845227711254,
"acc_norm": 0.21691176470588236,
"acc_norm_stderr": 0.025035845227711254
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.0180540274588152,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.0180540274588152
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2818181818181818,
"acc_stderr": 0.04309118709946458,
"acc_norm": 0.2818181818181818,
"acc_norm_stderr": 0.04309118709946458
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.1836734693877551,
"acc_stderr": 0.024789071332007653,
"acc_norm": 0.1836734693877551,
"acc_norm_stderr": 0.024789071332007653
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-virology|5": {
"acc": 0.29518072289156627,
"acc_stderr": 0.035509201856896294,
"acc_norm": 0.29518072289156627,
"acc_norm_stderr": 0.035509201856896294
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.03565079670708311,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.03565079670708311
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23990208078335373,
"mc1_stderr": 0.014948812679062137,
"mc2": 0.4095943166947606,
"mc2_stderr": 0.014642509125225842
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
senthilsk/crack_detection_dataset | ---
task_categories:
- object-detection
tags:
- roboflow
- roboflow2huggingface
---
<div align="center">
<img width="640" alt="senthilsk/crack_detection_dataset" src="https://huggingface.co/datasets/senthilsk/crack_detection_dataset/resolve/main/thumbnail.jpg">
</div>
### Dataset Labels
```
['crack', 'mold', 'peeling_paint', 'stairstep_crack', 'water_seepage']
```
### Number of Images
```json
{'valid': 462, 'test': 225, 'train': 2263}
```
### How to Use
- Install [datasets](https://pypi.org/project/datasets/):
```bash
pip install datasets
```
- Load the dataset:
```python
from datasets import load_dataset
ds = load_dataset("senthilsk/crack_detection_dataset", name="full")
example = ds['train'][0]
```
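Since the labels are annotated in COCO format, each example is expected to carry an `objects` field with `bbox` lists in `[x, y, width, height]` form. The sketch below is a minimal illustration of reading such annotations and converting a box to corner coordinates; the field names (`objects`, `bbox`, `category`) are assumptions based on the usual roboflow2huggingface export schema, so inspect `ds['train'].features` on the real dataset to confirm.

```python
# Hypothetical example record mimicking the schema that roboflow2huggingface
# exports typically use (field names are assumptions, not confirmed here).
example = {
    "image_id": 0,
    "objects": {
        "bbox": [[10.0, 20.0, 30.0, 40.0]],  # COCO format: [x, y, width, height]
        "category": [0],                      # index into the label list above
    },
}

LABELS = ["crack", "mold", "peeling_paint", "stairstep_crack", "water_seepage"]

def coco_to_corners(bbox):
    """Convert a COCO [x, y, w, h] box to [x_min, y_min, x_max, y_max]."""
    x, y, w, h = bbox
    return [x, y, x + w, y + h]

for bbox, cat in zip(example["objects"]["bbox"], example["objects"]["category"]):
    print(LABELS[cat], coco_to_corners(bbox))
```

Corner-form boxes are what most detection visualizers and IoU utilities expect, which is why the conversion helper is shown here.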
### Roboflow Dataset Page
[https://universe.roboflow.com/objectdetection-qxiqx/detr_crack_dataset/dataset/1](https://universe.roboflow.com/objectdetection-qxiqx/detr_crack_dataset/dataset/1?ref=roboflow2huggingface)
### Citation
```
@misc{ detr_crack_dataset_dataset,
title = { detr_crack_dataset Dataset },
type = { Open Source Dataset },
author = { objectdetection },
howpublished = { \url{ https://universe.roboflow.com/objectdetection-qxiqx/detr_crack_dataset } },
url = { https://universe.roboflow.com/objectdetection-qxiqx/detr_crack_dataset },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2024 },
month = { jan },
note = { visited on 2024-01-09 },
}
```
### License
CC BY 4.0
### Dataset Summary
This dataset was exported via roboflow.com on January 9, 2024 at 4:01 AM GMT
Roboflow is an end-to-end computer vision platform that helps you
* collaborate with your team on computer vision projects
* collect & organize images
* understand and search unstructured image data
* annotate and create datasets
* export, train, and deploy computer vision models
* use active learning to improve your dataset over time
For state-of-the-art computer vision training notebooks you can use with this dataset,
visit https://github.com/roboflow/notebooks
To find over 100k other datasets and pre-trained models, visit https://universe.roboflow.com
The dataset includes 2950 images.
Cracks-AX10-cracks are annotated in COCO format.
No image pre-processing or augmentation techniques were applied.
|
odunola/yoruba-audio-preprocessed-2 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 14139553174.75
num_examples: 11506
download_size: 5975711747
dataset_size: 14139553174.75
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
cjvt/sloTS | ---
dataset_info:
features:
- name: complex
dtype: string
- name: simple
dtype: string
splits:
- name: train
num_bytes: 158705
num_examples: 973
download_size: 186255
dataset_size: 158705
language:
- sl
multilinguality:
- monolingual
license:
- cc-by-4.0
task_categories:
- text-generation
size_categories:
- n<1K
---
# Dataset Card for SloTS
### Dataset Summary
SloTS is a sentence simplification dataset containing 973 pairs of complex and simplified sentences.
In some cases one complex sentence is translated into multiple simplified sentences, or multiple complex sentences are translated into one simplified sentence.
### Languages
Slovenian.
## Dataset Structure
### Data Instances
A sample instance from the dataset:
```
{
'complex': 'Vsa vas je dobro vedela, da ga na svetu ni hudobnejšega človeka od Vrbarjevega Matevža .',
'simple': 'Matevž je bil zelo hudoben človek .'
}
```
### Data Fields
- 'complex': sentence in its complex form;
- 'simple': sentence in its simplified form.
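Since the card lists `text-generation` as the task category, the complex/simple pairs can be formatted into prompt/target strings for supervised fine-tuning. The sketch below shows one way to do this using the sample instance above; the prompt template is an assumption for illustration, not something SloTS prescribes.

```python
# One pair from the dataset (the sample instance shown above).
pair = {
    'complex': 'Vsa vas je dobro vedela, da ga na svetu ni hudobnejšega človeka od Vrbarjevega Matevža .',
    'simple': 'Matevž je bil zelo hudoben človek .',
}

def to_prompt(example):
    """Turn one complex/simple pair into a (prompt, target) tuple.

    The Slovenian instruction 'Poenostavi:' ('Simplify:') is a hypothetical
    template choice, not part of the dataset.
    """
    prompt = f"Poenostavi: {example['complex']}\nPoenostavljeno:"
    return prompt, example['simple']

prompt, target = to_prompt(pair)
print(prompt)
print(target)
```

The same function can be mapped over the `train` split with `ds.map(...)` once the dataset is loaded.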
## Additional Information
### Dataset Curators
Gorenc, Sabina and Robnik-Šikonja, Marko
### Licensing Information
CC BY 4.0
### Citation Information
```
@misc{sloTS,
title = {Slovene text simplification dataset {SloTS}},
author = {Gorenc, Sabina and Robnik-{\v S}ikonja, Marko},
url = {http://hdl.handle.net/11356/1682},
note = {Slovenian language resource repository {CLARIN}.{SI}},
copyright = {Creative Commons - Attribution 4.0 International ({CC} {BY} 4.0)},
year = {2022}
}
```
### Contributions
Thanks to Hana Skitek for adding this dataset.
|