datasetId stringlengths 2 117 | card stringlengths 19 1.01M |
|---|---|
JamieWithofs/Deepfake-and-real-images-validation | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Fake
'1': Real
splits:
- name: validation
num_bytes: 221394513.687
num_examples: 2041
download_size: 225841664
dataset_size: 221394513.687
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
|
HydraLM/partitioned_v3_standardized_011 | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: dataset_id
dtype: string
- name: unique_id
dtype: string
splits:
- name: train
num_bytes: 11869026.810806446
num_examples: 22073
download_size: 8319441
dataset_size: 11869026.810806446
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "partitioned_v3_standardized_011"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Margaret-mmh/mini-platypus | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4186564
num_examples: 1000
download_size: 2245925
dataset_size: 4186564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sarpba/common_voice_16.1_hu_texts | ---
license: apache-2.0
---
STT with OpenAI Whisper large V3, used to collect the better-quality parts of the voices. |
savaskaplan/sk-review-dataset-sample | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: review
dtype: string
- name: review_length
dtype: int64
splits:
- name: train
num_bytes: 1336616.5478017514
num_examples: 3600
- name: validation
num_bytes: 148512.94975575016
num_examples: 400
download_size: 951377
dataset_size: 1485129.4975575015
---
# Dataset Card for "sk-review-dataset-sample"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Asaf-Yehudai/HelpSteer_prompt_per_row | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: responses
list:
- name: response
dtype: string
- name: scores
struct:
- name: coherence
dtype: int64
- name: complexity
dtype: int64
- name: correctness
dtype: int64
- name: helpfulness
dtype: int64
- name: verbosity
dtype: int64
splits:
- name: train
num_bytes: 44115062
num_examples: 9944
- name: validation
num_bytes: 2267028
num_examples: 503
download_size: 25199197
dataset_size: 46382090
---
# Dataset Card for "HelpSteer_prompt_per_row"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JoaoJunior/java-encoded-small | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: rem
dtype: string
- name: add
dtype: string
- name: context
dtype: string
- name: meta
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 2551158
num_examples: 800
- name: test
num_bytes: 641178
num_examples: 200
download_size: 391779
dataset_size: 3192336
---
# Dataset Card for "java-encoded-small"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Charles333/json_lama_chat_1000 | ---
license: apache-2.0
---
|
adamjweintraut/eli5_lfqa_best_slice | ---
dataset_info:
features:
- name: index
dtype: int64
- name: q_id
dtype: string
- name: question
dtype: string
- name: best_answer
dtype: string
- name: all_answers
sequence: string
- name: num_answers
dtype: int64
- name: top_answers
sequence: string
- name: num_top_answers
dtype: int64
- name: context
dtype: string
- name: orig
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 138199303
num_examples: 10000
- name: test
num_bytes: 17022480
num_examples: 1250
- name: validation
num_bytes: 17375258
num_examples: 1250
download_size: 103906913
dataset_size: 172597041
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
zolak/twitter_dataset_1712967588 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 1335979
num_examples: 4497
download_size: 692271
dataset_size: 1335979
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zamal/SKIPPD | ---
license: other
---
This is a dataset containing sky images and their corresponding PV panel output data. It is intended for educational and research purposes only; commercial use may result in legal action.
|
Seongill/squad_conflict_v2_under_150_with_substitution_chunked | ---
dataset_info:
features:
- name: id
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
- name: masked_query
dtype: string
- name: query_embedding
sequence: float64
- name: ent_type
dtype: string
- name: answer
dtype: string
- name: random_answer
dtype: string
- name: similar_answer
dtype: string
- name: rewritten_context
dtype: string
- name: has_answer
dtype: bool
- name: answer_sent
dtype: string
- name: rewritten_answer_sent
dtype: string
- name: answer_chunk
dtype: string
- name: rewritten_answer_chunk
dtype: string
splits:
- name: train
num_bytes: 238309199
num_examples: 25866
download_size: 153568266
dataset_size: 238309199
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
wentingzhao/obqa | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 473240
num_examples: 5957
download_size: 318952
dataset_size: 473240
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CholeDYM/full_sh_for_train | ---
dataset_info:
features:
- name: lit_image
dtype: image
- name: delit_image
dtype: image
- name: normal_image
dtype: image
- name: text
dtype: string
- name: SH_idx
sequence: float32
- name: rot_idx
sequence: float32
- name: gamm_ang_S
sequence: int64
splits:
- name: train
num_bytes: 766509969.418
num_examples: 5999
download_size: 762466483
dataset_size: 766509969.418
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "full_sh_for_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hsiehpinghan/test | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: package_name
dtype: string
- name: review
dtype: string
- name: date
dtype: string
- name: star
dtype: int64
- name: version_id
dtype: int64
splits:
- name: train
num_bytes: 1508
num_examples: 5
- name: test
num_bytes: 956
num_examples: 5
download_size: 0
dataset_size: 2464
---
# Dataset Card for "test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TheSleepyJo/seabream-freshness_v0 | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 783931699.0
num_examples: 16
download_size: 51974320
dataset_size: 783931699.0
---
# Dataset Card for "seabream-freshness_v0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jjzha/imdb-dutch-instruct | ---
language:
- nl
license:
- apache-2.0
size_categories:
- 10K<n<100K
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: template_lang
sequence: string
- name: template_id
dtype: int32
splits:
- name: train
num_examples: 24992
- name: test
num_examples: 24992
---
# Dataset Card for "imdb-dutch-instruct"
## Dataset Description
The original IMDB dataset was translated to Dutch with [yhavinga/ul2-large-en-nl](https://huggingface.co/yhavinga/ul2-large-en-nl).
The dataset was then converted to an instruct-style dataset using the following templates.
The instruction templates:
"Is deze recensie positief of negatief?",
"Wat is het sentiment van de recensie?",
"Wat voor toon heeft de volgende recensie?",
"Met wat voor sentiment zou je deze recensie beoordelen?"
The target templates:
"De recensie is",
"Gegeven de recensie, mijn antwoord is",
"Deze recensie is",
"De beoordeling hier is",
"Het antwoord is"
The template IDs are here:
```[
(0, 'Is deze recensie positief of negatief?', 'De recensie is'),
(1, 'Is deze recensie positief of negatief?', 'Gegeven de recensie, mijn antwoord is'),
(2, 'Is deze recensie positief of negatief?', 'Deze recensie is'),
(3, 'Is deze recensie positief of negatief?', 'De beoordeling hier is'),
(4, 'Is deze recensie positief of negatief?', 'Het antwoord is'),
(5, 'Wat is het sentiment van de recensie?', 'De recensie is'),
(6, 'Wat is het sentiment van de recensie?', 'Gegeven de recensie, mijn antwoord is'),
(7, 'Wat is het sentiment van de recensie?', 'Deze recensie is'),
(8, 'Wat is het sentiment van de recensie?', 'De beoordeling hier is'),
(9, 'Wat is het sentiment van de recensie?', 'Het antwoord is'),
(10, 'Wat voor toon heeft de volgende recensie?', 'De recensie is'),
(11, 'Wat voor toon heeft de volgende recensie?', 'Gegeven de recensie, mijn antwoord is'),
(12, 'Wat voor toon heeft de volgende recensie?', 'Deze recensie is'),
(13, 'Wat voor toon heeft de volgende recensie?', 'De beoordeling hier is'),
(14, 'Wat voor toon heeft de volgende recensie?', 'Het antwoord is'),
(15, 'Met wat voor sentiment zou je deze recensie beoordelen?', 'De recensie is'),
(16, 'Met wat voor sentiment zou je deze recensie beoordelen?', 'Gegeven de recensie, mijn antwoord is'),
(17, 'Met wat voor sentiment zou je deze recensie beoordelen?', 'Deze recensie is'),
(18, 'Met wat voor sentiment zou je deze recensie beoordelen?', 'De beoordeling hier is'),
(19, 'Met wat voor sentiment zou je deze recensie beoordelen?', 'Het antwoord is')
]```
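The 20 template IDs above are simply the Cartesian product of the four instruction templates and the five target templates, enumerated in order. A minimal sketch of that enumeration (the variable names are illustrative, not taken from the dataset's build script):

```python
from itertools import product

# The four instruction templates and five target templates listed in the card.
instructions = [
    "Is deze recensie positief of negatief?",
    "Wat is het sentiment van de recensie?",
    "Wat voor toon heeft de volgende recensie?",
    "Met wat voor sentiment zou je deze recensie beoordelen?",
]
targets = [
    "De recensie is",
    "Gegeven de recensie, mijn antwoord is",
    "Deze recensie is",
    "De beoordeling hier is",
    "Het antwoord is",
]

# Enumerate every (instruction, target) pair with a stable template ID,
# iterating all targets for each instruction before moving on.
templates = [
    (i, instr, tgt)
    for i, (instr, tgt) in enumerate(product(instructions, targets))
]
```

Iterating the instruction list in the outer position reproduces the ID ordering shown above (IDs 0-4 share the first instruction, IDs 5-9 the second, and so on).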
### Dataset Summary
The Large Movie Review Dataset, translated to Dutch and converted to instruct style.
It is a dataset for binary sentiment classification containing substantially more data than previous benchmark datasets.
### Languages and Example
This dataset contains Dutch data.
An example of 'train' looks as follows.
```
{
"inputs": "Is deze recensie positief of negatief?\n\nIk heb alle vier de films in deze serie gezien. Elke film wijkt steeds verder af van de boeken. Deze is de ergste tot nu toe. Mijn probleem is dat hij op geen enkele manier het boek volgt waar hij naar genoemd is! De regisseurs en producenten hadden hem een andere naam moeten geven dan 'Love's Abiding Joy'. Het enige aan deze film dat ook maar in de verte op het boek lijkt, zijn de namen van sommige personages (Willie, Missie, Henry, Clark, Scottie en Cookie). De namen/ouders/verzorgers van de kinderen kloppen niet. De hele verhaallijn staat nergens in het boek.<br />Ik vind het een grote belediging voor Janette Oke, haar boeken en haar fans om een film onder haar titel te produceren die in geen enkel opzicht correct is. De muziek is te hard. De acteurs zijn niet overtuigend – ze missen emoties.<br />Als je een goede familiefilm wilt, is dit misschien goed. Het is schoon. Maar kijk er niet naar, als je hoopt op een verkorte versie van het boek. Ik hoop dat dit de laatste film uit deze serie zal zijn, maar ik betwijfel het. Als er meer films worden gemaakt, zou ik willen dat Michael Landon jr. en anderen dichter bij de oorspronkelijke plot en verhaallijn zouden blijven. De boeken zijn uitstekend en als je ze goed leest, zijn het uitstekende films!",
"targets": "Het antwoord is negatief."}
```
### Data Fields
The data fields are the same among all splits.
#### plain_text
- `inputs`: a `string` feature, starting with a question asking whether the review is positive or negative.
- `targets`: a `string` feature, consisting of a template prefix followed by the final label.
- `template_lang`: a sequence of `string` features indicating the language of each sentence.
- `template_id`: an `int32` feature indicating which template has been used.
### Data Splits
| name |train|test |
|----------|----:|----:|
|plain_text|24992|24992|
### Official Citation Information
The original data is from here: https://huggingface.co/datasets/yhavinga/imdb_dutch
```
@InProceedings{maas-EtAl:2011:ACL-HLT2011,
author = {Maas, Andrew L. and Daly, Raymond E. and Pham, Peter T. and Huang, Dan and Ng, Andrew Y. and Potts, Christopher},
title = {Learning Word Vectors for Sentiment Analysis},
booktitle = {Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies},
month = {June},
year = {2011},
address = {Portland, Oregon, USA},
publisher = {Association for Computational Linguistics},
pages = {142--150},
url = {http://www.aclweb.org/anthology/P11-1015}
}
```
Created by [Mike Zhang](https://jjzha.github.io/)
|
irds/msmarco-document_trec-dl-hard_fold2 | ---
pretty_name: '`msmarco-document/trec-dl-hard/fold2`'
viewer: false
source_datasets: ['irds/msmarco-document']
task_categories:
- text-retrieval
---
# Dataset Card for `msmarco-document/trec-dl-hard/fold2`
The `msmarco-document/trec-dl-hard/fold2` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/msmarco-document#msmarco-document/trec-dl-hard/fold2).
# Data
This dataset provides:
- `queries` (i.e., topics); count=10
- `qrels` (i.e., relevance assessments); count=1,345
- For `docs`, use [`irds/msmarco-document`](https://huggingface.co/datasets/irds/msmarco-document)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/msmarco-document_trec-dl-hard_fold2', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/msmarco-document_trec-dl-hard_fold2', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Mackie2021DlHard,
title={How Deep is your Learning: the DL-HARD Annotated Deep Learning Dataset},
author={Iain Mackie and Jeffrey Dalton and Andrew Yates},
journal={ArXiv},
year={2021},
volume={abs/2105.07975}
}
@inproceedings{Bajaj2016Msmarco,
title={MS MARCO: A Human Generated MAchine Reading COmprehension Dataset},
  author={Payal Bajaj and Daniel Campos and Nick Craswell and Li Deng and Jianfeng Gao and Xiaodong Liu and Rangan Majumder and Andrew McNamara and Bhaskar Mitra and Tri Nguyen and Mir Rosenberg and Xia Song and Alina Stoica and Saurabh Tiwary and Tong Wang},
booktitle={InCoCo@NIPS},
year={2016}
}
```
|
yzhuang/autotree_automl_10000_default-of-credit-card-clients_sgosdt_l256_dim10_d3_sd0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: input_y_clean
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 236440000
num_examples: 10000
- name: validation
num_bytes: 236440000
num_examples: 10000
download_size: 122258450
dataset_size: 472880000
---
# Dataset Card for "autotree_automl_10000_default-of-credit-card-clients_sgosdt_l256_dim10_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Stevross/mmlu | ---
annotations_creators:
- no-annotation
language_creators:
- expert-generated
language:
- en
license:
- mit
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- question-answering
task_ids:
- multiple-choice-qa
paperswithcode_id: mmlu
pretty_name: Measuring Massive Multitask Language Understanding
language_bcp47:
- en-US
dataset_info:
- config_name: abstract_algebra
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 19328
num_examples: 100
- name: validation
num_bytes: 2024
num_examples: 11
- name: dev
num_bytes: 830
num_examples: 5
download_size: 166184960
dataset_size: 160623559
- config_name: anatomy
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 33121
num_examples: 135
- name: validation
num_bytes: 3140
num_examples: 14
- name: dev
num_bytes: 967
num_examples: 5
download_size: 166184960
dataset_size: 160638605
- config_name: astronomy
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 46771
num_examples: 152
- name: validation
num_bytes: 5027
num_examples: 16
- name: dev
num_bytes: 2076
num_examples: 5
download_size: 166184960
dataset_size: 160655251
- config_name: business_ethics
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 33252
num_examples: 100
- name: validation
num_bytes: 3038
num_examples: 11
- name: dev
num_bytes: 2190
num_examples: 5
download_size: 166184960
dataset_size: 160639857
- config_name: clinical_knowledge
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 62754
num_examples: 265
- name: validation
num_bytes: 6664
num_examples: 29
- name: dev
num_bytes: 1210
num_examples: 5
download_size: 166184960
dataset_size: 160672005
- config_name: college_biology
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 48797
num_examples: 144
- name: validation
num_bytes: 4819
num_examples: 16
- name: dev
num_bytes: 1532
num_examples: 5
download_size: 166184960
dataset_size: 160656525
- config_name: college_chemistry
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 24708
num_examples: 100
- name: validation
num_bytes: 2328
num_examples: 8
- name: dev
num_bytes: 1331
num_examples: 5
download_size: 166184960
dataset_size: 160629744
- config_name: college_computer_science
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 42641
num_examples: 100
- name: validation
num_bytes: 4663
num_examples: 11
- name: dev
num_bytes: 2765
num_examples: 5
download_size: 166184960
dataset_size: 160651446
- config_name: college_mathematics
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 24711
num_examples: 100
- name: validation
num_bytes: 2668
num_examples: 11
- name: dev
num_bytes: 1493
num_examples: 5
download_size: 166184960
dataset_size: 160630249
- config_name: college_medicine
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 82397
num_examples: 173
- name: validation
num_bytes: 7909
num_examples: 22
- name: dev
num_bytes: 1670
num_examples: 5
download_size: 166184960
dataset_size: 160693353
- config_name: college_physics
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 30181
num_examples: 102
- name: validation
num_bytes: 3490
num_examples: 11
- name: dev
num_bytes: 1412
num_examples: 5
download_size: 166184960
dataset_size: 160636460
- config_name: computer_security
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 27124
num_examples: 100
- name: validation
num_bytes: 4549
num_examples: 11
- name: dev
num_bytes: 1101
num_examples: 5
download_size: 166184960
dataset_size: 160634151
- config_name: conceptual_physics
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 40709
num_examples: 235
- name: validation
num_bytes: 4474
num_examples: 26
- name: dev
num_bytes: 934
num_examples: 5
download_size: 166184960
dataset_size: 160647494
- config_name: econometrics
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 46547
num_examples: 114
- name: validation
num_bytes: 4967
num_examples: 12
- name: dev
num_bytes: 1644
num_examples: 5
download_size: 166184960
dataset_size: 160654535
- config_name: electrical_engineering
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 25142
num_examples: 145
- name: validation
num_bytes: 2903
num_examples: 16
- name: dev
num_bytes: 972
num_examples: 5
download_size: 166184960
dataset_size: 160630394
- config_name: elementary_mathematics
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 70108
num_examples: 378
- name: validation
num_bytes: 8988
num_examples: 41
- name: dev
num_bytes: 1440
num_examples: 5
download_size: 166184960
dataset_size: 160681913
- config_name: formal_logic
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 49785
num_examples: 126
- name: validation
num_bytes: 6252
num_examples: 14
- name: dev
num_bytes: 1757
num_examples: 5
download_size: 166184960
dataset_size: 160659171
- config_name: global_facts
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 18403
num_examples: 100
- name: validation
num_bytes: 1865
num_examples: 10
- name: dev
num_bytes: 1229
num_examples: 5
download_size: 166184960
dataset_size: 160622874
- config_name: high_school_biology
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 109732
num_examples: 310
- name: validation
num_bytes: 11022
num_examples: 32
- name: dev
num_bytes: 1673
num_examples: 5
download_size: 166184960
dataset_size: 160723804
- config_name: high_school_chemistry
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 58464
num_examples: 203
- name: validation
num_bytes: 7092
num_examples: 22
- name: dev
num_bytes: 1220
num_examples: 5
download_size: 166184960
dataset_size: 160668153
- config_name: high_school_computer_science
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 44476
num_examples: 100
- name: validation
num_bytes: 3343
num_examples: 9
- name: dev
num_bytes: 2918
num_examples: 5
download_size: 166184960
dataset_size: 160652114
- config_name: high_school_european_history
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 270300
num_examples: 165
- name: validation
num_bytes: 29632
num_examples: 18
- name: dev
num_bytes: 11564
num_examples: 5
download_size: 166184960
dataset_size: 160912873
- config_name: high_school_geography
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 42034
num_examples: 198
- name: validation
num_bytes: 4332
num_examples: 22
- name: dev
num_bytes: 1403
num_examples: 5
download_size: 166184960
dataset_size: 160649146
- config_name: high_school_government_and_politics
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 66074
num_examples: 193
- name: validation
num_bytes: 7063
num_examples: 21
- name: dev
num_bytes: 1779
num_examples: 5
download_size: 166184960
dataset_size: 160676293
- config_name: high_school_macroeconomics
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 117687
num_examples: 390
- name: validation
num_bytes: 13020
num_examples: 43
- name: dev
num_bytes: 1328
num_examples: 5
download_size: 166184960
dataset_size: 160733412
- config_name: high_school_mathematics
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 54854
num_examples: 270
- name: validation
num_bytes: 5765
num_examples: 29
- name: dev
num_bytes: 1297
num_examples: 5
download_size: 166184960
dataset_size: 160663293
- config_name: high_school_microeconomics
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 75703
num_examples: 238
- name: validation
num_bytes: 7553
num_examples: 26
- name: dev
num_bytes: 1298
num_examples: 5
download_size: 166184960
dataset_size: 160685931
- config_name: high_school_physics
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 59538
num_examples: 151
- name: validation
num_bytes: 6771
num_examples: 17
- name: dev
num_bytes: 1489
num_examples: 5
download_size: 166184960
dataset_size: 160669175
- config_name: high_school_psychology
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 159407
num_examples: 545
- name: validation
num_bytes: 17269
num_examples: 60
- name: dev
num_bytes: 1905
num_examples: 5
download_size: 166184960
dataset_size: 160779958
- config_name: high_school_statistics
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 110702
num_examples: 216
- name: validation
num_bytes: 9997
num_examples: 23
- name: dev
num_bytes: 2528
num_examples: 5
download_size: 166184960
dataset_size: 160724604
- config_name: high_school_us_history
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 296734
num_examples: 204
- name: validation
num_bytes: 31706
num_examples: 22
- name: dev
num_bytes: 8864
num_examples: 5
download_size: 166184960
dataset_size: 160938681
- config_name: high_school_world_history
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 378617
num_examples: 237
- name: validation
num_bytes: 45501
num_examples: 26
- name: dev
num_bytes: 4882
num_examples: 5
download_size: 166184960
dataset_size: 161030377
- config_name: human_aging
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 46098
num_examples: 223
- name: validation
num_bytes: 4707
num_examples: 23
- name: dev
num_bytes: 1008
num_examples: 5
download_size: 166184960
dataset_size: 160653190
- config_name: human_sexuality
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 32110
num_examples: 131
- name: validation
num_bytes: 2421
num_examples: 12
- name: dev
num_bytes: 1077
num_examples: 5
download_size: 166184960
dataset_size: 160636985
- config_name: international_law
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 53531
num_examples: 121
- name: validation
num_bytes: 6473
num_examples: 13
- name: dev
num_bytes: 2418
num_examples: 5
download_size: 166184960
dataset_size: 160663799
- config_name: jurisprudence
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 33986
num_examples: 108
- name: validation
num_bytes: 3729
num_examples: 11
- name: dev
num_bytes: 1303
num_examples: 5
download_size: 166184960
dataset_size: 160640395
- config_name: logical_fallacies
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 50117
num_examples: 163
- name: validation
num_bytes: 5103
num_examples: 18
- name: dev
num_bytes: 1573
num_examples: 5
download_size: 166184960
dataset_size: 160658170
- config_name: machine_learning
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 33880
num_examples: 112
- name: validation
num_bytes: 3232
num_examples: 11
- name: dev
num_bytes: 2323
num_examples: 5
download_size: 166184960
dataset_size: 160640812
- config_name: management
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 20002
num_examples: 103
- name: validation
num_bytes: 1820
num_examples: 11
- name: dev
num_bytes: 898
num_examples: 5
download_size: 166184960
dataset_size: 160624097
- config_name: marketing
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 63025
num_examples: 234
- name: validation
num_bytes: 7394
num_examples: 25
- name: dev
num_bytes: 1481
num_examples: 5
download_size: 166184960
dataset_size: 160673277
- config_name: medical_genetics
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 20864
num_examples: 100
- name: validation
num_bytes: 3005
num_examples: 11
- name: dev
num_bytes: 1089
num_examples: 5
download_size: 166184960
dataset_size: 160626335
- config_name: miscellaneous
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 147704
num_examples: 783
- name: validation
num_bytes: 14330
num_examples: 86
- name: dev
num_bytes: 699
num_examples: 5
download_size: 166184960
dataset_size: 160764110
- config_name: moral_disputes
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 107818
num_examples: 346
- name: validation
num_bytes: 12420
num_examples: 38
- name: dev
num_bytes: 1755
num_examples: 5
download_size: 166184960
dataset_size: 160723370
- config_name: moral_scenarios
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 374026
num_examples: 895
- name: validation
num_bytes: 42338
num_examples: 100
- name: dev
num_bytes: 2058
num_examples: 5
download_size: 166184960
dataset_size: 161019799
- config_name: nutrition
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 92410
num_examples: 306
- name: validation
num_bytes: 8436
num_examples: 33
- name: dev
num_bytes: 2085
num_examples: 5
download_size: 166184960
dataset_size: 160704308
- config_name: philosophy
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 80073
num_examples: 311
- name: validation
num_bytes: 9184
num_examples: 34
- name: dev
num_bytes: 988
num_examples: 5
download_size: 166184960
dataset_size: 160691622
- config_name: prehistory
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 89594
num_examples: 324
- name: validation
num_bytes: 10285
num_examples: 35
- name: dev
num_bytes: 1878
num_examples: 5
download_size: 166184960
dataset_size: 160703134
- config_name: professional_accounting
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 124550
num_examples: 282
- name: validation
num_bytes: 14372
num_examples: 31
- name: dev
num_bytes: 2148
num_examples: 5
download_size: 166184960
dataset_size: 160742447
- config_name: professional_law
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 1891762
num_examples: 1534
- name: validation
num_bytes: 203519
num_examples: 170
- name: dev
num_bytes: 6610
num_examples: 5
download_size: 166184960
dataset_size: 162703268
- config_name: professional_medicine
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 217561
num_examples: 272
- name: validation
num_bytes: 23847
num_examples: 31
- name: dev
num_bytes: 3807
num_examples: 5
download_size: 166184960
dataset_size: 160846592
- config_name: professional_psychology
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 225899
num_examples: 612
- name: validation
num_bytes: 29101
num_examples: 69
- name: dev
num_bytes: 2267
num_examples: 5
download_size: 166184960
dataset_size: 160858644
- config_name: public_relations
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 28760
num_examples: 110
- name: validation
num_bytes: 4566
num_examples: 12
- name: dev
num_bytes: 1496
num_examples: 5
download_size: 166184960
dataset_size: 160636199
- config_name: security_studies
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 204844
num_examples: 245
- name: validation
num_bytes: 22637
num_examples: 27
- name: dev
num_bytes: 5335
num_examples: 5
download_size: 166184960
dataset_size: 160834193
- config_name: sociology
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 66243
num_examples: 201
- name: validation
num_bytes: 7184
num_examples: 22
- name: dev
num_bytes: 1613
num_examples: 5
download_size: 166184960
dataset_size: 160676417
- config_name: us_foreign_policy
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 28443
num_examples: 100
- name: validation
num_bytes: 3264
num_examples: 11
- name: dev
num_bytes: 1611
num_examples: 5
download_size: 166184960
dataset_size: 160634695
- config_name: virology
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 38759
num_examples: 166
- name: validation
num_bytes: 5463
num_examples: 18
- name: dev
num_bytes: 1096
num_examples: 5
download_size: 166184960
dataset_size: 160646695
- config_name: world_religions
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: auxiliary_train
num_bytes: 160601377
num_examples: 99842
- name: test
num_bytes: 25274
num_examples: 171
- name: validation
num_bytes: 2765
num_examples: 19
- name: dev
num_bytes: 670
num_examples: 5
download_size: 166184960
dataset_size: 160630086
---
# Dataset Card for MMLU
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Repository**: https://github.com/hendrycks/test
- **Paper**: https://arxiv.org/abs/2009.03300
### Dataset Summary
[Measuring Massive Multitask Language Understanding](https://arxiv.org/pdf/2009.03300) by [Dan Hendrycks](https://people.eecs.berkeley.edu/~hendrycks/), [Collin Burns](http://collinpburns.com), [Steven Basart](https://stevenbas.art), Andy Zou, Mantas Mazeika, [Dawn Song](https://people.eecs.berkeley.edu/~dawnsong/), and [Jacob Steinhardt](https://www.stat.berkeley.edu/~jsteinhardt/) (ICLR 2021).
This is a massive multitask test consisting of multiple-choice questions from various branches of knowledge. The test spans subjects in the humanities, social sciences, hard sciences, and other areas that are important for some people to learn. This covers 57 tasks including elementary mathematics, US history, computer science, law, and more. To attain high accuracy on this test, models must possess extensive world knowledge and problem solving ability.
A complete list of tasks: ['abstract_algebra', 'anatomy', 'astronomy', 'business_ethics', 'clinical_knowledge', 'college_biology', 'college_chemistry', 'college_computer_science', 'college_mathematics', 'college_medicine', 'college_physics', 'computer_security', 'conceptual_physics', 'econometrics', 'electrical_engineering', 'elementary_mathematics', 'formal_logic', 'global_facts', 'high_school_biology', 'high_school_chemistry', 'high_school_computer_science', 'high_school_european_history', 'high_school_geography', 'high_school_government_and_politics', 'high_school_macroeconomics', 'high_school_mathematics', 'high_school_microeconomics', 'high_school_physics', 'high_school_psychology', 'high_school_statistics', 'high_school_us_history', 'high_school_world_history', 'human_aging', 'human_sexuality', 'international_law', 'jurisprudence', 'logical_fallacies', 'machine_learning', 'management', 'marketing', 'medical_genetics', 'miscellaneous', 'moral_disputes', 'moral_scenarios', 'nutrition', 'philosophy', 'prehistory', 'professional_accounting', 'professional_law', 'professional_medicine', 'professional_psychology', 'public_relations', 'security_studies', 'sociology', 'us_foreign_policy', 'virology', 'world_religions']
### Supported Tasks and Leaderboards
| Model | Authors | Humanities | Social Science | STEM | Other | Average |
|------------------------------------|----------|:-------:|:-------:|:-------:|:-------:|:-------:|
| [UnifiedQA](https://arxiv.org/abs/2005.00700) | Khashabi et al., 2020 | 45.6 | 56.6 | 40.2 | 54.6 | 48.9 |
| [GPT-3](https://arxiv.org/abs/2005.14165) (few-shot) | Brown et al., 2020 | 40.8 | 50.4 | 36.7 | 48.8 | 43.9 |
| [GPT-2](https://arxiv.org/abs/2005.14165) | Radford et al., 2019 | 32.8 | 33.3 | 30.2 | 33.1 | 32.4 |
| Random Baseline | N/A | 25.0 | 25.0 | 25.0 | 25.0 | 25.0 |
### Languages
English
## Dataset Structure
### Data Instances
An example from the anatomy subtask looks as follows:
```
{
"question": "What is the embryological origin of the hyoid bone?",
"choices": ["The first pharyngeal arch", "The first and second pharyngeal arches", "The second pharyngeal arch", "The second and third pharyngeal arches"],
"answer": "D"
}
```
### Data Fields
- `question`: a string feature
- `choices`: a list of 4 string features
- `answer`: a ClassLabel feature
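Because `answer` is a ClassLabel, on disk it is stored as an integer index into `choices` (the letter shown in the instance above is its label name). A minimal sketch of mapping between the two; the record here is illustrative, not read from the dataset:

```python
# Names of the answer ClassLabel, in index order (from the features spec above).
ANSWER_NAMES = ["A", "B", "C", "D"]

# Illustrative record: `answer` holds the integer ClassLabel index.
record = {
    "question": "What is the embryological origin of the hyoid bone?",
    "choices": [
        "The first pharyngeal arch",
        "The first and second pharyngeal arches",
        "The second pharyngeal arch",
        "The second and third pharyngeal arches",
    ],
    "answer": 3,
}

letter = ANSWER_NAMES[record["answer"]]        # the label name, e.g. "D"
correct = record["choices"][record["answer"]]  # the text of the correct choice
print(letter, "->", correct)
```

The same mapping is exposed by `datasets` itself via the feature's `int2str`/`str2int` helpers.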
### Data Splits
- `auxiliary_train`: auxiliary multiple-choice training questions from ARC, MC_TEST, OBQA, RACE, etc.
- `dev`: 5 examples per subtask, meant for the few-shot setting
- `test`: there are at least 100 examples per subtask
| | auxiliary_train | dev | val | test |
| ----- | :------: | :-----: | :-----: | :-----: |
| TOTAL | 99842 | 285 | 1531 | 14042 |
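In the few-shot setting, the five `dev` examples are typically prepended, with their answers, ahead of each `test` question to form a 5-shot prompt. A minimal sketch of that construction (the records below are made up; real ones would come from the `dev` and `test` splits):

```python
LETTERS = ["A", "B", "C", "D"]

def format_example(ex, include_answer=True):
    """Render one record as the question, lettered choices, and an 'Answer:' line."""
    lines = [ex["question"]]
    lines += [f"{letter}. {choice}" for letter, choice in zip(LETTERS, ex["choices"])]
    answer = f" {LETTERS[ex['answer']]}" if include_answer else ""
    lines.append("Answer:" + answer)
    return "\n".join(lines)

def build_prompt(dev_examples, test_example):
    """All dev shots with answers shown, then the test question left open."""
    blocks = [format_example(ex) for ex in dev_examples]
    blocks.append(format_example(test_example, include_answer=False))
    return "\n\n".join(blocks)

# Made-up records following the dataset schema (`answer` is a ClassLabel index).
dev = [{"question": "2 + 2 = ?", "choices": ["3", "4", "5", "6"], "answer": 1}]
test = {"question": "3 + 3 = ?", "choices": ["5", "6", "7", "8"], "answer": 1}
prompt = build_prompt(dev, test)
print(prompt)
```

The exact prompt template (separators, the "Answer:" cue) varies between evaluation harnesses; this only illustrates how the `dev` split is consumed.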
## Dataset Creation
### Curation Rationale
Transformer models have driven recent progress in NLP by pretraining on massive text corpora, including all of Wikipedia, thousands of books, and numerous websites. These models consequently see extensive information about specialized topics, most of which is not assessed by existing NLP benchmarks. To bridge the gap between the wide-ranging knowledge that models see during pretraining and the existing measures of success, we introduce a new benchmark for assessing models across a diverse set of subjects that humans learn.
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[MIT License](https://github.com/hendrycks/test/blob/master/LICENSE)
### Citation Information
If you find this useful in your research, please consider citing the test and also the [ETHICS](https://arxiv.org/abs/2008.02275) dataset it draws from:
```
@article{hendryckstest2021,
title={Measuring Massive Multitask Language Understanding},
author={Dan Hendrycks and Collin Burns and Steven Basart and Andy Zou and Mantas Mazeika and Dawn Song and Jacob Steinhardt},
journal={Proceedings of the International Conference on Learning Representations (ICLR)},
year={2021}
}
@article{hendrycks2021ethics,
title={Aligning AI With Shared Human Values},
author={Dan Hendrycks and Collin Burns and Steven Basart and Andrew Critch and Jerry Li and Dawn Song and Jacob Steinhardt},
journal={Proceedings of the International Conference on Learning Representations (ICLR)},
year={2021}
}
```
### Contributions
Thanks to [@andyzoujm](https://github.com/andyzoujm) for adding this dataset.
|
category3/PDBookCovers | ---
license: cc0-1.0
---
|
micsell/common_voice_en | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 283633490.0
num_examples: 8000
- name: test
num_bytes: 82967419.0
num_examples: 2000
download_size: 368007132
dataset_size: 366600909.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
AdapterOcean/math_dataset_standardized_cluster_3_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 28168082
num_examples: 18654
download_size: 13122952
dataset_size: 28168082
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "math_dataset_standardized_cluster_3_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
atsushi3110/cross-lingual-openorcha-830k-en-ja | ---
license: cc-by-sa-4.0
---
|
BirdL/DalleCatsAndDogs | ---
dataset_info:
features:
- name: Images
dtype: image
- name: class
dtype: string
splits:
- name: train
num_bytes: 49662722.0
num_examples: 500
download_size: 49664703
dataset_size: 49662722.0
---
# Dataset Card for "DalleCatsAndDogs"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_abideen__gemma-2b-openhermes | ---
pretty_name: Evaluation run of abideen/gemma-2b-openhermes
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [abideen/gemma-2b-openhermes](https://huggingface.co/abideen/gemma-2b-openhermes)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abideen__gemma-2b-openhermes\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-22T12:05:01.077115](https://huggingface.co/datasets/open-llm-leaderboard/details_abideen__gemma-2b-openhermes/blob/main/results_2024-02-22T12-05-01.077115.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.37696524742191334,\n\
\ \"acc_stderr\": 0.03381316358729798,\n \"acc_norm\": 0.3815378335823341,\n\
\ \"acc_norm_stderr\": 0.03461953317836164,\n \"mc1\": 0.29008567931456547,\n\
\ \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.458323806326475,\n\
\ \"mc2_stderr\": 0.015931044127458407\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4044368600682594,\n \"acc_stderr\": 0.014342036483436172,\n\
\ \"acc_norm\": 0.439419795221843,\n \"acc_norm_stderr\": 0.01450374782358013\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4810794662417845,\n\
\ \"acc_stderr\": 0.00498620758186293,\n \"acc_norm\": 0.627365066719777,\n\
\ \"acc_norm_stderr\": 0.004825179407757572\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3851851851851852,\n\
\ \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.3851851851851852,\n\
\ \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3355263157894737,\n \"acc_stderr\": 0.03842498559395269,\n\
\ \"acc_norm\": 0.3355263157894737,\n \"acc_norm_stderr\": 0.03842498559395269\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n\
\ \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.48,\n \
\ \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.42641509433962266,\n \"acc_stderr\": 0.030437794342983042,\n\
\ \"acc_norm\": 0.42641509433962266,\n \"acc_norm_stderr\": 0.030437794342983042\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3402777777777778,\n\
\ \"acc_stderr\": 0.03962135573486219,\n \"acc_norm\": 0.3402777777777778,\n\
\ \"acc_norm_stderr\": 0.03962135573486219\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768077,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768077\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3583815028901734,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.3583815028901734,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.038739587141493524,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.038739587141493524\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.35319148936170214,\n \"acc_stderr\": 0.031245325202761923,\n\
\ \"acc_norm\": 0.35319148936170214,\n \"acc_norm_stderr\": 0.031245325202761923\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.04154659671707546,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.04154659671707546\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24867724867724866,\n \"acc_stderr\": 0.022261817692400168,\n \"\
acc_norm\": 0.24867724867724866,\n \"acc_norm_stderr\": 0.022261817692400168\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.03893259610604674,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.03893259610604674\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n\
\ \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.3161290322580645,\n\
\ \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.29064039408866993,\n \"acc_stderr\": 0.0319474007226554,\n\
\ \"acc_norm\": 0.29064039408866993,\n \"acc_norm_stderr\": 0.0319474007226554\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.46060606060606063,\n \"acc_stderr\": 0.03892207016552012,\n\
\ \"acc_norm\": 0.46060606060606063,\n \"acc_norm_stderr\": 0.03892207016552012\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4595959595959596,\n \"acc_stderr\": 0.035507024651313425,\n \"\
acc_norm\": 0.4595959595959596,\n \"acc_norm_stderr\": 0.035507024651313425\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.47668393782383417,\n \"acc_stderr\": 0.03604513672442207,\n\
\ \"acc_norm\": 0.47668393782383417,\n \"acc_norm_stderr\": 0.03604513672442207\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3282051282051282,\n \"acc_stderr\": 0.023807633198657262,\n\
\ \"acc_norm\": 0.3282051282051282,\n \"acc_norm_stderr\": 0.023807633198657262\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2,\n \"acc_stderr\": 0.024388430433987664,\n \"acc_norm\"\
: 0.2,\n \"acc_norm_stderr\": 0.024388430433987664\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.3403361344537815,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.3403361344537815,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5100917431192661,\n \"acc_stderr\": 0.021432956203453316,\n \"\
acc_norm\": 0.5100917431192661,\n \"acc_norm_stderr\": 0.021432956203453316\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2037037037037037,\n \"acc_stderr\": 0.027467401804057986,\n \"\
acc_norm\": 0.2037037037037037,\n \"acc_norm_stderr\": 0.027467401804057986\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4215686274509804,\n \"acc_stderr\": 0.03465868196380758,\n \"\
acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.03465868196380758\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5189873417721519,\n \"acc_stderr\": 0.03252375148090448,\n \
\ \"acc_norm\": 0.5189873417721519,\n \"acc_norm_stderr\": 0.03252375148090448\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3901345291479821,\n\
\ \"acc_stderr\": 0.03273766725459157,\n \"acc_norm\": 0.3901345291479821,\n\
\ \"acc_norm_stderr\": 0.03273766725459157\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.42748091603053434,\n \"acc_stderr\": 0.04338920305792401,\n\
\ \"acc_norm\": 0.42748091603053434,\n \"acc_norm_stderr\": 0.04338920305792401\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5206611570247934,\n \"acc_stderr\": 0.04560456086387235,\n \"\
acc_norm\": 0.5206611570247934,\n \"acc_norm_stderr\": 0.04560456086387235\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.46296296296296297,\n\
\ \"acc_stderr\": 0.04820403072760627,\n \"acc_norm\": 0.46296296296296297,\n\
\ \"acc_norm_stderr\": 0.04820403072760627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3619631901840491,\n \"acc_stderr\": 0.037757007291414416,\n\
\ \"acc_norm\": 0.3619631901840491,\n \"acc_norm_stderr\": 0.037757007291414416\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.044642857142857116,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.044642857142857116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.44660194174757284,\n \"acc_stderr\": 0.04922424153458933,\n\
\ \"acc_norm\": 0.44660194174757284,\n \"acc_norm_stderr\": 0.04922424153458933\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.594017094017094,\n\
\ \"acc_stderr\": 0.03217180182641086,\n \"acc_norm\": 0.594017094017094,\n\
\ \"acc_norm_stderr\": 0.03217180182641086\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.46360153256704983,\n\
\ \"acc_stderr\": 0.01783252407959326,\n \"acc_norm\": 0.46360153256704983,\n\
\ \"acc_norm_stderr\": 0.01783252407959326\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.407514450867052,\n \"acc_stderr\": 0.0264545781469315,\n\
\ \"acc_norm\": 0.407514450867052,\n \"acc_norm_stderr\": 0.0264545781469315\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2536312849162011,\n\
\ \"acc_stderr\": 0.01455155365936992,\n \"acc_norm\": 0.2536312849162011,\n\
\ \"acc_norm_stderr\": 0.01455155365936992\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.02849199358617157,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.02849199358617157\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.40192926045016075,\n\
\ \"acc_stderr\": 0.027846476005930473,\n \"acc_norm\": 0.40192926045016075,\n\
\ \"acc_norm_stderr\": 0.027846476005930473\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.027339546640662727,\n\
\ \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.027339546640662727\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3049645390070922,\n \"acc_stderr\": 0.027464708442022135,\n \
\ \"acc_norm\": 0.3049645390070922,\n \"acc_norm_stderr\": 0.027464708442022135\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.31877444589308995,\n\
\ \"acc_stderr\": 0.0119018956357861,\n \"acc_norm\": 0.31877444589308995,\n\
\ \"acc_norm_stderr\": 0.0119018956357861\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20220588235294118,\n \"acc_stderr\": 0.024398192986654924,\n\
\ \"acc_norm\": 0.20220588235294118,\n \"acc_norm_stderr\": 0.024398192986654924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.380718954248366,\n \"acc_stderr\": 0.019643801557924806,\n \
\ \"acc_norm\": 0.380718954248366,\n \"acc_norm_stderr\": 0.019643801557924806\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.41818181818181815,\n\
\ \"acc_stderr\": 0.0472457740573157,\n \"acc_norm\": 0.41818181818181815,\n\
\ \"acc_norm_stderr\": 0.0472457740573157\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.47346938775510206,\n \"acc_stderr\": 0.03196412734523272,\n\
\ \"acc_norm\": 0.47346938775510206,\n \"acc_norm_stderr\": 0.03196412734523272\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.43283582089552236,\n\
\ \"acc_stderr\": 0.03503490923673281,\n \"acc_norm\": 0.43283582089552236,\n\
\ \"acc_norm_stderr\": 0.03503490923673281\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4036144578313253,\n\
\ \"acc_stderr\": 0.038194861407583984,\n \"acc_norm\": 0.4036144578313253,\n\
\ \"acc_norm_stderr\": 0.038194861407583984\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.03811079669833531,\n\
\ \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03811079669833531\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29008567931456547,\n\
\ \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.458323806326475,\n\
\ \"mc2_stderr\": 0.015931044127458407\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6093133385951065,\n \"acc_stderr\": 0.01371253603655665\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.056103108415466264,\n \
\ \"acc_stderr\": 0.006338668431321893\n }\n}\n```"
repo_url: https://huggingface.co/abideen/gemma-2b-openhermes
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|arc:challenge|25_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|gsm8k|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hellaswag|10_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T12-05-01.077115.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-22T12-05-01.077115.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- '**/details_harness|winogrande|5_2024-02-22T12-05-01.077115.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-22T12-05-01.077115.parquet'
- config_name: results
data_files:
- split: 2024_02_22T12_05_01.077115
path:
- results_2024-02-22T12-05-01.077115.parquet
- split: latest
path:
- results_2024-02-22T12-05-01.077115.parquet
---
# Dataset Card for Evaluation run of abideen/gemma-2b-openhermes
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abideen/gemma-2b-openhermes](https://huggingface.co/abideen/gemma-2b-openhermes) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abideen__gemma-2b-openhermes",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-22T12:05:01.077115](https://huggingface.co/datasets/open-llm-leaderboard/details_abideen__gemma-2b-openhermes/blob/main/results_2024-02-22T12-05-01.077115.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.37696524742191334,
"acc_stderr": 0.03381316358729798,
"acc_norm": 0.3815378335823341,
"acc_norm_stderr": 0.03461953317836164,
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.458323806326475,
"mc2_stderr": 0.015931044127458407
},
"harness|arc:challenge|25": {
"acc": 0.4044368600682594,
"acc_stderr": 0.014342036483436172,
"acc_norm": 0.439419795221843,
"acc_norm_stderr": 0.01450374782358013
},
"harness|hellaswag|10": {
"acc": 0.4810794662417845,
"acc_stderr": 0.00498620758186293,
"acc_norm": 0.627365066719777,
"acc_norm_stderr": 0.004825179407757572
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3851851851851852,
"acc_stderr": 0.042039210401562783,
"acc_norm": 0.3851851851851852,
"acc_norm_stderr": 0.042039210401562783
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3355263157894737,
"acc_stderr": 0.03842498559395269,
"acc_norm": 0.3355263157894737,
"acc_norm_stderr": 0.03842498559395269
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.48,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.48,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.42641509433962266,
"acc_stderr": 0.030437794342983042,
"acc_norm": 0.42641509433962266,
"acc_norm_stderr": 0.030437794342983042
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3402777777777778,
"acc_stderr": 0.03962135573486219,
"acc_norm": 0.3402777777777778,
"acc_norm_stderr": 0.03962135573486219
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3583815028901734,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.3583815028901734,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.038739587141493524,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.038739587141493524
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.35319148936170214,
"acc_stderr": 0.031245325202761923,
"acc_norm": 0.35319148936170214,
"acc_norm_stderr": 0.031245325202761923
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.04154659671707546,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.04154659671707546
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24867724867724866,
"acc_stderr": 0.022261817692400168,
"acc_norm": 0.24867724867724866,
"acc_norm_stderr": 0.022261817692400168
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604674,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604674
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.29064039408866993,
"acc_stderr": 0.0319474007226554,
"acc_norm": 0.29064039408866993,
"acc_norm_stderr": 0.0319474007226554
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.46060606060606063,
"acc_stderr": 0.03892207016552012,
"acc_norm": 0.46060606060606063,
"acc_norm_stderr": 0.03892207016552012
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4595959595959596,
"acc_stderr": 0.035507024651313425,
"acc_norm": 0.4595959595959596,
"acc_norm_stderr": 0.035507024651313425
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.47668393782383417,
"acc_stderr": 0.03604513672442207,
"acc_norm": 0.47668393782383417,
"acc_norm_stderr": 0.03604513672442207
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3282051282051282,
"acc_stderr": 0.023807633198657262,
"acc_norm": 0.3282051282051282,
"acc_norm_stderr": 0.023807633198657262
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2,
"acc_stderr": 0.024388430433987664,
"acc_norm": 0.2,
"acc_norm_stderr": 0.024388430433987664
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3403361344537815,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.3403361344537815,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5100917431192661,
"acc_stderr": 0.021432956203453316,
"acc_norm": 0.5100917431192661,
"acc_norm_stderr": 0.021432956203453316
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2037037037037037,
"acc_stderr": 0.027467401804057986,
"acc_norm": 0.2037037037037037,
"acc_norm_stderr": 0.027467401804057986
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.03465868196380758,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.03465868196380758
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5189873417721519,
"acc_stderr": 0.03252375148090448,
"acc_norm": 0.5189873417721519,
"acc_norm_stderr": 0.03252375148090448
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3901345291479821,
"acc_stderr": 0.03273766725459157,
"acc_norm": 0.3901345291479821,
"acc_norm_stderr": 0.03273766725459157
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.42748091603053434,
"acc_stderr": 0.04338920305792401,
"acc_norm": 0.42748091603053434,
"acc_norm_stderr": 0.04338920305792401
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5206611570247934,
"acc_stderr": 0.04560456086387235,
"acc_norm": 0.5206611570247934,
"acc_norm_stderr": 0.04560456086387235
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.04820403072760627,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.04820403072760627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3619631901840491,
"acc_stderr": 0.037757007291414416,
"acc_norm": 0.3619631901840491,
"acc_norm_stderr": 0.037757007291414416
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.044642857142857116,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.044642857142857116
},
"harness|hendrycksTest-management|5": {
"acc": 0.44660194174757284,
"acc_stderr": 0.04922424153458933,
"acc_norm": 0.44660194174757284,
"acc_norm_stderr": 0.04922424153458933
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.594017094017094,
"acc_stderr": 0.03217180182641086,
"acc_norm": 0.594017094017094,
"acc_norm_stderr": 0.03217180182641086
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.46360153256704983,
"acc_stderr": 0.01783252407959326,
"acc_norm": 0.46360153256704983,
"acc_norm_stderr": 0.01783252407959326
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.407514450867052,
"acc_stderr": 0.0264545781469315,
"acc_norm": 0.407514450867052,
"acc_norm_stderr": 0.0264545781469315
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2536312849162011,
"acc_stderr": 0.01455155365936992,
"acc_norm": 0.2536312849162011,
"acc_norm_stderr": 0.01455155365936992
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.02849199358617157,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.02849199358617157
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.40192926045016075,
"acc_stderr": 0.027846476005930473,
"acc_norm": 0.40192926045016075,
"acc_norm_stderr": 0.027846476005930473
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.027339546640662727,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.027339546640662727
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3049645390070922,
"acc_stderr": 0.027464708442022135,
"acc_norm": 0.3049645390070922,
"acc_norm_stderr": 0.027464708442022135
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.31877444589308995,
"acc_stderr": 0.0119018956357861,
"acc_norm": 0.31877444589308995,
"acc_norm_stderr": 0.0119018956357861
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20220588235294118,
"acc_stderr": 0.024398192986654924,
"acc_norm": 0.20220588235294118,
"acc_norm_stderr": 0.024398192986654924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.380718954248366,
"acc_stderr": 0.019643801557924806,
"acc_norm": 0.380718954248366,
"acc_norm_stderr": 0.019643801557924806
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.41818181818181815,
"acc_stderr": 0.0472457740573157,
"acc_norm": 0.41818181818181815,
"acc_norm_stderr": 0.0472457740573157
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.47346938775510206,
"acc_stderr": 0.03196412734523272,
"acc_norm": 0.47346938775510206,
"acc_norm_stderr": 0.03196412734523272
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.43283582089552236,
"acc_stderr": 0.03503490923673281,
"acc_norm": 0.43283582089552236,
"acc_norm_stderr": 0.03503490923673281
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4036144578313253,
"acc_stderr": 0.038194861407583984,
"acc_norm": 0.4036144578313253,
"acc_norm_stderr": 0.038194861407583984
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03811079669833531,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03811079669833531
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.458323806326475,
"mc2_stderr": 0.015931044127458407
},
"harness|winogrande|5": {
"acc": 0.6093133385951065,
"acc_stderr": 0.01371253603655665
},
"harness|gsm8k|5": {
"acc": 0.056103108415466264,
"acc_stderr": 0.006338668431321893
}
}
```
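The JSON above is simply a nested mapping from task name to metric values, so once loaded (e.g. with `json.load`) it can be inspected with ordinary dict operations. A minimal sketch, using a small hand-copied excerpt of the results shown above:

```python
# A small excerpt of the per-task results above, copied into a plain dict.
results = {
    "harness|arc:challenge|25": {"acc": 0.4044368600682594, "acc_norm": 0.439419795221843},
    "harness|hellaswag|10": {"acc": 0.4810794662417845, "acc_norm": 0.627365066719777},
    "harness|winogrande|5": {"acc": 0.6093133385951065},
    "harness|gsm8k|5": {"acc": 0.056103108415466264},
}

# Find the task with the highest raw accuracy in this excerpt.
best_task = max(results, key=lambda name: results[name]["acc"])
print(best_task)  # -> harness|winogrande|5
```

The same pattern applies to the full results file downloaded from the repo linked above.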
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
andrewsunanda/fast_food_image_classification | ---
task_categories:
- image-classification
language:
- en
--- |
james-burton/aug-text-exps-v3 | ---
dataset_info:
features:
- name: model_name
dtype: string
- name: predicted_class
dtype: string
- name: task_name
dtype: string
- name: narration
dtype: string
- name: values
sequence: string
- name: sign
sequence: string
- name: narrative_id
dtype: int32
- name: unique_id
dtype: int32
- name: classes_dict
dtype: string
- name: narrative_questions
sequence: string
- name: feature_nums
sequence: string
- name: ft_num2name
dtype: string
- name: old2new_ft_nums
dtype: string
- name: old2new_classes
dtype: string
- name: predicted_class_label
dtype: string
- name: class2name
dtype: string
splits:
- name: train
num_bytes: 8651458
num_examples: 3280
- name: validation
num_bytes: 121591
num_examples: 47
- name: test
num_bytes: 252513
num_examples: 94
download_size: 2382860
dataset_size: 9025562
---
# Dataset Card for "aug-text-exps-v3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
VedCodes/New_dataset_llm | ---
task_categories:
- text-generation
language:
- en
tags:
- medical
size_categories:
- n<1K
--- |
ChirathD/sinCorpus | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 99188239
num_examples: 43328
download_size: 41545918
dataset_size: 99188239
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_Azazelle__SlimMelodicMaid | ---
pretty_name: Evaluation run of Azazelle/SlimMelodicMaid
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Azazelle/SlimMelodicMaid](https://huggingface.co/Azazelle/SlimMelodicMaid) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Azazelle__SlimMelodicMaid\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-30T02:54:45.572792](https://huggingface.co/datasets/open-llm-leaderboard/details_Azazelle__SlimMelodicMaid/blob/main/results_2023-12-30T02-54-45.572792.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6495626338556433,\n\
\ \"acc_stderr\": 0.03193676074571155,\n \"acc_norm\": 0.6515021650371259,\n\
\ \"acc_norm_stderr\": 0.03257111121158258,\n \"mc1\": 0.43084455324357407,\n\
\ \"mc1_stderr\": 0.017335272475332363,\n \"mc2\": 0.6087927851947197,\n\
\ \"mc2_stderr\": 0.015566919235032412\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6424914675767918,\n \"acc_stderr\": 0.014005494275916576,\n\
\ \"acc_norm\": 0.6715017064846417,\n \"acc_norm_stderr\": 0.013724978465537302\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6796454889464251,\n\
\ \"acc_stderr\": 0.0046565916786067574,\n \"acc_norm\": 0.8600876319458275,\n\
\ \"acc_norm_stderr\": 0.0034618713240671954\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6127659574468085,\n \"acc_stderr\": 0.03184389265339526,\n\
\ \"acc_norm\": 0.6127659574468085,\n \"acc_norm_stderr\": 0.03184389265339526\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n\
\ \"acc_stderr\": 0.023157879349083522,\n \"acc_norm\": 0.7903225806451613,\n\
\ \"acc_norm_stderr\": 0.023157879349083522\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026705,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026705\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.02390115797940253,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.02390115797940253\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7310924369747899,\n \"acc_stderr\": 0.02880139219363127,\n \
\ \"acc_norm\": 0.7310924369747899,\n \"acc_norm_stderr\": 0.02880139219363127\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7354260089686099,\n\
\ \"acc_stderr\": 0.029605103217038325,\n \"acc_norm\": 0.7354260089686099,\n\
\ \"acc_norm_stderr\": 0.029605103217038325\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596914,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596914\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097654,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097654\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822585,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822585\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.013664230995834829,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.013664230995834829\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.02344582627654554,\n\
\ \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.02344582627654554\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3743016759776536,\n\
\ \"acc_stderr\": 0.01618544417945717,\n \"acc_norm\": 0.3743016759776536,\n\
\ \"acc_norm_stderr\": 0.01618544417945717\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.024288533637726095,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.024288533637726095\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4869621903520209,\n\
\ \"acc_stderr\": 0.012765893883835332,\n \"acc_norm\": 0.4869621903520209,\n\
\ \"acc_norm_stderr\": 0.012765893883835332\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7095588235294118,\n \"acc_stderr\": 0.027576468622740543,\n\
\ \"acc_norm\": 0.7095588235294118,\n \"acc_norm_stderr\": 0.027576468622740543\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6454248366013072,\n \"acc_stderr\": 0.0193533605475537,\n \
\ \"acc_norm\": 0.6454248366013072,\n \"acc_norm_stderr\": 0.0193533605475537\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7591836734693878,\n \"acc_stderr\": 0.02737294220178816,\n\
\ \"acc_norm\": 0.7591836734693878,\n \"acc_norm_stderr\": 0.02737294220178816\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.025196929874827075,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.025196929874827075\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.43084455324357407,\n\
\ \"mc1_stderr\": 0.017335272475332363,\n \"mc2\": 0.6087927851947197,\n\
\ \"mc2_stderr\": 0.015566919235032412\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7861089187056038,\n \"acc_stderr\": 0.011524466954090254\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6080363912054587,\n \
\ \"acc_stderr\": 0.013447140886023815\n }\n}\n```"
repo_url: https://huggingface.co/Azazelle/SlimMelodicMaid
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|arc:challenge|25_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|gsm8k|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hellaswag|10_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T02-54-45.572792.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T02-54-45.572792.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- '**/details_harness|winogrande|5_2023-12-30T02-54-45.572792.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-30T02-54-45.572792.parquet'
- config_name: results
data_files:
- split: 2023_12_30T02_54_45.572792
path:
- results_2023-12-30T02-54-45.572792.parquet
- split: latest
path:
- results_2023-12-30T02-54-45.572792.parquet
---
# Dataset Card for Evaluation run of Azazelle/SlimMelodicMaid
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Azazelle/SlimMelodicMaid](https://huggingface.co/Azazelle/SlimMelodicMaid) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Azazelle__SlimMelodicMaid",
"harness_winogrande_5",
split="train")
```
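Once loaded, the aggregated results are nested mappings shaped like the JSON shown under "Latest results" below. As a minimal sketch of working with that shape, the snippet below pulls the per-task `acc_norm` values out of such a mapping and averages them; the sample dict is a small excerpt of the results in this card, not the full file.

```python
# Sample excerpt shaped like the "Latest results" JSON in this card.
results = {
    "all": {"acc": 0.6495626338556433, "acc_norm": 0.6515021650371259},
    "harness|arc:challenge|25": {"acc": 0.6424914675767918, "acc_norm": 0.6715017064846417},
    "harness|hellaswag|10": {"acc": 0.6796454889464251, "acc_norm": 0.8600876319458275},
}

# Collect acc_norm per task, skipping the pre-aggregated "all" entry.
per_task_acc_norm = {
    task: metrics["acc_norm"]
    for task, metrics in results.items()
    if task != "all" and "acc_norm" in metrics
}

# Unweighted mean over the tasks present in this excerpt.
mean_acc_norm = sum(per_task_acc_norm.values()) / len(per_task_acc_norm)
```

Note that the "all" entry in the real results file is the aggregate computed over every task, so it will generally differ from a mean taken over a subset like this one.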
## Latest results
These are the [latest results from run 2023-12-30T02:54:45.572792](https://huggingface.co/datasets/open-llm-leaderboard/details_Azazelle__SlimMelodicMaid/blob/main/results_2023-12-30T02-54-45.572792.json) (note that there may be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6495626338556433,
"acc_stderr": 0.03193676074571155,
"acc_norm": 0.6515021650371259,
"acc_norm_stderr": 0.03257111121158258,
"mc1": 0.43084455324357407,
"mc1_stderr": 0.017335272475332363,
"mc2": 0.6087927851947197,
"mc2_stderr": 0.015566919235032412
},
"harness|arc:challenge|25": {
"acc": 0.6424914675767918,
"acc_stderr": 0.014005494275916576,
"acc_norm": 0.6715017064846417,
"acc_norm_stderr": 0.013724978465537302
},
"harness|hellaswag|10": {
"acc": 0.6796454889464251,
"acc_stderr": 0.0046565916786067574,
"acc_norm": 0.8600876319458275,
"acc_norm_stderr": 0.0034618713240671954
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6127659574468085,
"acc_stderr": 0.03184389265339526,
"acc_norm": 0.6127659574468085,
"acc_norm_stderr": 0.03184389265339526
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083522,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083522
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026705,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026705
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.02390115797940253,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.02390115797940253
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7310924369747899,
"acc_stderr": 0.02880139219363127,
"acc_norm": 0.7310924369747899,
"acc_norm_stderr": 0.02880139219363127
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7354260089686099,
"acc_stderr": 0.029605103217038325,
"acc_norm": 0.7354260089686099,
"acc_norm_stderr": 0.029605103217038325
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596914,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596914
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097654,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097654
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834829,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834829
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.02344582627654554,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.02344582627654554
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3743016759776536,
"acc_stderr": 0.01618544417945717,
"acc_norm": 0.3743016759776536,
"acc_norm_stderr": 0.01618544417945717
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.024288533637726095,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.024288533637726095
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4869621903520209,
"acc_stderr": 0.012765893883835332,
"acc_norm": 0.4869621903520209,
"acc_norm_stderr": 0.012765893883835332
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7095588235294118,
"acc_stderr": 0.027576468622740543,
"acc_norm": 0.7095588235294118,
"acc_norm_stderr": 0.027576468622740543
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6454248366013072,
"acc_stderr": 0.0193533605475537,
"acc_norm": 0.6454248366013072,
"acc_norm_stderr": 0.0193533605475537
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7591836734693878,
"acc_stderr": 0.02737294220178816,
"acc_norm": 0.7591836734693878,
"acc_norm_stderr": 0.02737294220178816
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827075,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827075
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.43084455324357407,
"mc1_stderr": 0.017335272475332363,
"mc2": 0.6087927851947197,
"mc2_stderr": 0.015566919235032412
},
"harness|winogrande|5": {
"acc": 0.7861089187056038,
"acc_stderr": 0.011524466954090254
},
"harness|gsm8k|5": {
"acc": 0.6080363912054587,
"acc_stderr": 0.013447140886023815
}
}
```
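The "all" block above aggregates the per-task metrics. As a minimal sketch (not part of the evaluation harness), the aggregate accuracy can be recomputed by averaging the `acc` field over the individual task entries; the `sample` dict below uses a small hypothetical subset of the tasks shown above, for illustration only:

```python
# Sketch: recompute an aggregate accuracy from a per-task results dict
# shaped like the JSON above. The task subset here is illustrative.

def aggregate_acc(results: dict) -> float:
    """Average the 'acc' field over every task entry except 'all'."""
    accs = [
        metrics["acc"]
        for task, metrics in results.items()
        if task != "all" and "acc" in metrics
    ]
    return sum(accs) / len(accs)

sample = {
    "all": {"acc": 0.65},
    "harness|arc:challenge|25": {"acc": 0.6424914675767918},
    "harness|hellaswag|10": {"acc": 0.6796454889464251},
    "harness|winogrande|5": {"acc": 0.7861089187056038},
}

print(round(aggregate_acc(sample), 4))
```

Note that the leaderboard's own aggregation also weighs `acc_norm`, `mc2`, and other per-task metrics, so this simple average is only an approximation.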
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
IceMasterT/BTC-Data-Daily-2014-2023 | ---
license: mit
task_categories:
- token-classification
- text-classification
language:
- en
tags:
- finance
pretty_name: Bitcoin Data
size_categories:
- 1K<n<10K
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/dusevnyj_neuralcloud | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of dusevnyj (Neural Cloud)
This is the dataset of dusevnyj (Neural Cloud), containing 12 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
|
HydraIndicLM/odia_alpaca_dolly_67k | ---
dataset_info:
features:
- name: output
dtype: string
- name: instruction
dtype: string
- name: input
dtype: string
splits:
- name: train
num_bytes: 117352849
num_examples: 64389
download_size: 44003356
dataset_size: 117352849
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
## About
This repo contains a 67K instruction set for Odia, translated from Alpaca and Dolly.
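Since the card's YAML lists `instruction`, `input`, and `output` columns, each row can be rendered into an Alpaca-style training prompt. The sketch below is one common way to do this; the record shown is hypothetical (real rows in this dataset are in Odia):

```python
# Sketch: turn one record with this card's "instruction"/"input"/"output"
# columns into an Alpaca-style training prompt. The example record is
# hypothetical and in English for readability.

def build_prompt(example: dict) -> str:
    if example.get("input"):
        return (
            "### Instruction:\n{instruction}\n\n"
            "### Input:\n{input}\n\n"
            "### Response:\n{output}"
        ).format(**example)
    return (
        "### Instruction:\n{instruction}\n\n"
        "### Response:\n{output}"
    ).format(**example)

record = {
    "instruction": "Translate the sentence to English.",
    "input": "Example sentence.",
    "output": "Example translation.",
}

print(build_prompt(record))
```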
## Citation
If you find this repository useful, please consider giving 👏 and citing:
```
@misc{OdiaAlpacaDolly,
author = {Sambit Sekhar and Shantipriya Parida},
title = {Odia Instruction Set Based on Alpaca and Dolly},
year = {2023},
publisher = {Hugging Face},
journal = {Hugging Face repository},
howpublished = {\url{https://huggingface.co/OdiaGenAI}},
}
```
|
heliosprime/twitter_dataset_1712970354 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 6184
num_examples: 14
download_size: 8268
dataset_size: 6184
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1712970354"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AlignmentResearch/IMDB | ---
dataset_info:
- config_name: default
features:
- name: text
dtype: string
- name: clf_label
dtype:
class_label:
names:
'0': neg
'1': pos
- name: chunked_text
sequence: string
splits:
- name: train
num_bytes: 61222150
num_examples: 24477
- name: validation
num_bytes: 60091078
num_examples: 24513
download_size: 79107768
dataset_size: 121313228
- config_name: neg
features:
- name: text
dtype: string
- name: clf_label
dtype:
class_label:
names:
'0': neg
'1': pos
- name: chunked_text
sequence: string
splits:
- name: train
num_bytes: 30669853.46651959
num_examples: 12262
- name: validation
num_bytes: 30098244.020886876
num_examples: 12278
download_size: 39410373
dataset_size: 60768097.48740646
- config_name: pos
features:
- name: text
dtype: string
- name: clf_label
dtype:
class_label:
names:
'0': neg
'1': pos
- name: chunked_text
sequence: string
splits:
- name: train
num_bytes: 30552296.53348041
num_examples: 12215
- name: validation
num_bytes: 29992833.979113124
num_examples: 12235
download_size: 39739964
dataset_size: 60545130.51259354
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- config_name: neg
data_files:
- split: train
path: neg/train-*
- split: validation
path: neg/validation-*
- config_name: pos
data_files:
- split: train
path: pos/train-*
- split: validation
path: pos/validation-*
---
|
Freela/leom | ---
license: openrail
---
|
liuyanchen1015/MULTI_VALUE_qqp_definite_for_indefinite_articles | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 2270650
num_examples: 13219
- name: test
num_bytes: 23668466
num_examples: 137338
- name: train
num_bytes: 20579019
num_examples: 119383
download_size: 28797821
dataset_size: 46518135
---
# Dataset Card for "MULTI_VALUE_qqp_definite_for_indefinite_articles"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Sofoklis/stem_like | ---
dataset_info:
features:
- name: number
dtype: int64
- name: name
dtype: string
- name: sequence
dtype: string
- name: spaced_sequence
dtype: string
- name: array
sequence:
sequence: float64
- name: image
dtype: image
splits:
- name: train
num_bytes: 347058.9
num_examples: 90
- name: test
num_bytes: 38562.1
num_examples: 10
- name: valid
num_bytes: 69411.78
num_examples: 18
download_size: 97724
dataset_size: 455032.78
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
|
rntc/few_shot_ncbi_disease_wikipedia | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: gold
dtype: string
- name: doc_id
dtype: int64
- name: sent_offset
sequence: int64
- name: sent_len
dtype: int64
- name: context
dtype: string
splits:
- name: train
num_bytes: 4358257
num_examples: 978
download_size: 664033
dataset_size: 4358257
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
shivam9980/Gemma-news-hindi | ---
license: apache-2.0
---
|
kaleemWaheed/twitter_dataset_1713105477 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 27976
num_examples: 66
download_size: 16111
dataset_size: 27976
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
alvations/c4p0-x1-en-ko | ---
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: string
- name: target_backto_source
dtype: string
- name: raw_target
list:
- name: generated_text
dtype: string
- name: raw_target_backto_source
list:
- name: generated_text
dtype: string
- name: prompt
dtype: string
- name: reverse_prompt
dtype: string
- name: source_langid
dtype: string
- name: target_langid
dtype: string
- name: target_backto_source_langid
dtype: string
- name: doc_id
dtype: int64
- name: sent_id
dtype: int64
- name: timestamp
dtype: string
- name: url
dtype: string
- name: doc_hash
dtype: string
splits:
- name: train
num_bytes: 91997
num_examples: 105
download_size: 50077
dataset_size: 91997
configs:
- config_name: default
data_files:
- split: train
path: 3d8d18c1775c05f6/train-*
---
|
result-kand2-sdxl-wuerst-karlo/16193144 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 169
num_examples: 10
download_size: 1324
dataset_size: 169
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "16193144"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CM/codexglue_codetrans | ---
dataset_info:
features:
- name: id
dtype: int32
- name: java
dtype: string
- name: cs
dtype: string
splits:
- name: train
num_bytes: 4372641
num_examples: 10300
- name: validation
num_bytes: 226407
num_examples: 500
- name: test
num_bytes: 418587
num_examples: 1000
download_size: 0
dataset_size: 5017635
---
# Dataset Card for "codexglue_codetrans"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shreevigneshs/iwslt-2023-en-es-train-val-split-0.1 | ---
dataset_info:
features:
- name: en
dtype: string
- name: es
dtype: string
- name: es_annotated
dtype: string
- name: styles
dtype: int64
splits:
- name: train
num_bytes: 258566.0
num_examples: 720
- name: val
num_bytes: 28014.0
num_examples: 80
- name: if_test
num_bytes: 225583.0
num_examples: 600
- name: f_test
num_bytes: 225065.0
num_examples: 600
download_size: 283452
dataset_size: 737228.0
---
# Dataset Card for "iwslt-2023-en-es-train-val-split-0.1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BangumiBase/citrus | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Citrus
This is the image base of bangumi Citrus, we detected 18 characters, 1393 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may contain noisy samples.** If you intend to manually train models using this dataset, we recommend performing necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 374 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 58 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 49 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 29 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 17 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 73 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 241 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 30 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 97 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 15 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 7 | [Download](10/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 11 | 24 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 31 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 11 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 90 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 76 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 44 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 127 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
Nadav/pixel_glue_qqp_high_noise | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
splits:
- name: validation
num_bytes: 1445009192.25
num_examples: 40430
download_size: 1443796278
dataset_size: 1445009192.25
---
# Dataset Card for "pixel_glue_qqp_high_noise"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tr416/dataset_20231006_234030 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 762696.0
num_examples: 297
- name: test
num_bytes: 7704.0
num_examples: 3
download_size: 73965
dataset_size: 770400.0
---
# Dataset Card for "dataset_20231006_234030"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CHEN0312/gssgd | ---
license: apache-2.0
---
|
liuyanchen1015/MULTI_VALUE_qqp_a_ing | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 1769080
num_examples: 9569
- name: test
num_bytes: 17265757
num_examples: 94216
- name: train
num_bytes: 15748086
num_examples: 84826
download_size: 21690691
dataset_size: 34782923
---
# Dataset Card for "MULTI_VALUE_qqp_a_ing"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vinicm/modelomichelle | ---
license: openrail
---
|
CyberHarem/kamisato_ayaka_genshin | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kamisato_ayaka/神里綾華/神里绫华 (Genshin Impact)
This is the dataset of kamisato_ayaka/神里綾華/神里绫华 (Genshin Impact), containing 500 images and their tags.
The core tags of this character are `blue_eyes, blunt_bangs, long_hair, ribbon, ponytail, hair_ribbon, blue_hair, sidelocks, mole_under_eye, mole, hair_ornament, breasts, tress_ribbon, light_blue_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:---------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 1.36 GiB | [Download](https://huggingface.co/datasets/CyberHarem/kamisato_ayaka_genshin/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 1.08 GiB | [Download](https://huggingface.co/datasets/CyberHarem/kamisato_ayaka_genshin/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1400 | 2.16 GiB | [Download](https://huggingface.co/datasets/CyberHarem/kamisato_ayaka_genshin/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kamisato_ayaka_genshin',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, alternate_costume, looking_at_viewer, white_dress, bare_shoulders, blush, solo, smile, closed_mouth, medium_breasts, sleeveless_dress, bow, white_hair, bare_arms, blunt_tresses, cleavage, collarbone, spaghetti_strap, flower, outdoors, very_long_hair |
| 1 | 47 |  |  |  |  |  | 1girl, breastplate, japanese_armor, japanese_clothes, solo, folding_fan, holding_fan, looking_at_viewer, armored_dress, smile, arm_guards, gloves, bridal_gauntlets, closed_mouth, blush, petals, tassel, blue_skirt |
| 2 | 12 |  |  |  |  |  | 1girl, breastplate, holding_sword, japanese_armor, japanese_clothes, solo, katana, looking_at_viewer, arm_guards, tassel, armored_dress, bridal_gauntlets, blue_skirt, closed_mouth, gloves, petals |
| 3 | 22 |  |  |  |  |  | 1girl, solo, looking_at_viewer, wide_sleeves, long_sleeves, obi, blue_kimono, floral_print, smile, hair_flower, blush, closed_mouth, folding_fan, holding_fan, alternate_costume, very_long_hair |
| 4 | 8 |  |  |  |  |  | 1girl, bare_shoulders, looking_at_viewer, solo, blush, chest_sarashi, cleavage, medium_breasts, closed_mouth, off_shoulder, indoors, kimono, skirt, very_long_hair, collarbone, cup, large_breasts, petals, smile, tassel, thighs |
| 5 | 22 |  |  |  |  |  | 1girl, white_shirt, looking_at_viewer, pleated_skirt, serafuku, solo, hair_bow, long_sleeves, blue_skirt, blush, smile, alternate_costume, blunt_tresses, white_hair, white_sailor_collar, school_bag, outdoors, blue_sky, cowboy_shot, closed_mouth, day, blue_neckerchief, cloud, thighs, very_long_hair |
| 6 | 15 |  |  |  |  |  | 1girl, looking_at_viewer, solo, thighs, bare_shoulders, navel, outdoors, stomach, blush, cleavage, very_long_hair, collarbone, blue_sky, cowboy_shot, water, alternate_costume, day, wet, bare_arms, cloud, large_breasts, medium_breasts, smile, white_hair, flower_knot, halterneck, ocean, parted_lips, standing, white_bikini, blunt_tresses |
| 7 | 32 |  |  |  |  |  | 1girl, blue_dress, butterfly_hair_ornament, hair_flower, official_alternate_costume, official_alternate_hairstyle, solo, puffy_long_sleeves, braid, blunt_tresses, brown_headwear, looking_at_viewer, white_collar, multicolored_dress, smile, hat_flower, white_pantyhose, blush, medium_breasts, holding, neck_tassel, outdoors, white_flower, back_bow, blue_butterfly, closed_mouth, hand_up, petals |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | alternate_costume | looking_at_viewer | white_dress | bare_shoulders | blush | solo | smile | closed_mouth | medium_breasts | sleeveless_dress | bow | white_hair | bare_arms | blunt_tresses | cleavage | collarbone | spaghetti_strap | flower | outdoors | very_long_hair | breastplate | japanese_armor | japanese_clothes | folding_fan | holding_fan | armored_dress | arm_guards | gloves | bridal_gauntlets | petals | tassel | blue_skirt | holding_sword | katana | wide_sleeves | long_sleeves | obi | blue_kimono | floral_print | hair_flower | chest_sarashi | off_shoulder | indoors | kimono | skirt | cup | large_breasts | thighs | white_shirt | pleated_skirt | serafuku | hair_bow | white_sailor_collar | school_bag | blue_sky | cowboy_shot | day | blue_neckerchief | cloud | navel | stomach | water | wet | flower_knot | halterneck | ocean | parted_lips | standing | white_bikini | blue_dress | butterfly_hair_ornament | official_alternate_costume | official_alternate_hairstyle | puffy_long_sleeves | braid | brown_headwear | white_collar | multicolored_dress | hat_flower | white_pantyhose | holding | neck_tassel | white_flower | back_bow | blue_butterfly | hand_up |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------------------|:--------------|:-----------------|:--------|:-------|:--------|:---------------|:-----------------|:-------------------|:------|:-------------|:------------|:----------------|:-----------|:-------------|:------------------|:---------|:-----------|:-----------------|:--------------|:-----------------|:-------------------|:--------------|:--------------|:----------------|:-------------|:---------|:-------------------|:---------|:---------|:-------------|:----------------|:---------|:---------------|:---------------|:------|:--------------|:---------------|:--------------|:----------------|:---------------|:----------|:---------|:--------|:------|:----------------|:---------|:--------------|:----------------|:-----------|:-----------|:----------------------|:-------------|:-----------|:--------------|:------|:-------------------|:--------|:--------|:----------|:--------|:------|:--------------|:-------------|:--------|:--------------|:-----------|:---------------|:-------------|:--------------------------|:-----------------------------|:-------------------------------|:---------------------|:--------|:-----------------|:---------------|:---------------------|:-------------|:------------------|:----------|:--------------|:---------------|:-----------|:-----------------|:----------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 47 |  |  |  |  |  | X | | X | | | X | X | X | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 12 |  |  |  |  |  | X | | X | | | | X | | X | | | | | | | | | | | | | X | X | X | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 22 |  |  |  |  |  | X | X | X | | | X | X | X | X | | | | | | | | | | | | X | | | | X | X | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 8 |  |  |  |  |  | X | | X | | X | X | X | X | X | X | | | | | | X | X | | | | X | | | | | | | | | | X | X | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 22 |  |  |  |  |  | X | X | X | | | X | X | X | X | | | | X | | X | | | | | X | X | | | | | | | | | | | | X | | | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 15 |  |  |  |  |  | X | X | X | | X | X | X | X | | X | | | X | X | X | X | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | | | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 7 | 32 |  |  |  |  |  | X | | X | | | X | X | X | X | X | | | | | X | | | | | X | | | | | | | | | | | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Nyameri/AIXDR | ---
license: mit
task_categories:
- summarization
- feature-extraction
pretty_name: AI Threat Hunter's Playbook
size_categories:
- 1K<n<10K
---
# Dataset Card for AIXDR
<!-- AI XDR playbook -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- AI xdr paper
XDR (Extended Detection and Response) is a security solution that combines multiple detection and response technologies to provide a more comprehensive view of an organization's security posture, making it easier to recognize and respond to potential threats[1]. AI/ML (Artificial Intelligence/Machine Learning) is a key component of XDR, as it enables advanced analytics techniques to identify potential threats and automate response actions[1][2]. Here are some ways in which AI enhances XDR platforms:
- **Advanced analytics**: XDR solutions use advanced analytics techniques supported by machine learning (ML) models to identify potential threats and automate response actions[1][5].
- **Automated response**: XDR solutions can automatically block or quarantine malicious files and alert security teams to potential incidents[1].
- **Single pane of glass view**: XDR solutions provide a unified view of all security events and incidents, making it easier for security teams to investigate and respond to threats[1].
- **Detecting unknown or zero-day threats**: AI-powered XDR solutions can detect unknown or zero-day threats, making them more effective than traditional detection and response technologies that rely on rule-based or signature-based detection methods[1][5].
- **Predicting future cyberattacks**: AI is able to predict future cyberattacks and identify their mechanisms to determine their origin, accelerating responses to attacks[5].
XDR platforms with AI can perform analyses on every layer of an organization's infrastructure, including those that were previously inaccessible to analysts[5]. AI analyzes logs and compares current activities on an organization's infrastructure to detect any unusual action on all its infrastructures, including servers, workstations, and networks[5]. Additionally, an AI-powered XDR with Next Generation Antivirus (NGAV) can detect unknown malicious files[5]. If an anomaly is detected, the sensors immediately send the information back to the XDR, which can automatically prioritize alerts so that security teams can immediately respond to potential threats[5].
Citations:
[1] Machine Learning and Artificial Intelligence (AI/ML): The Secret Sauce Behind XDR https://www.computer.org/publications/tech-news/trends/the-secret-sauce-behind-xdr/
[2] AI-Driven XDR: Defeating the Most Complex Attack Sequences - Cybereason https://www.cybereason.com/blog/ai-driven-xdr-defeating-the-most-complex-attack-sequences
[3] Harnessing the Power of AI-Driven XDR - Cybereason https://www.cybereason.com/blog/harnessing-the-power-of-ai-driven-xdr
[4] Explainable dimensionality reduction (XDR) to unbox AI 'black box' models: A study of AI perspectives on the ethnic styles of village dwellings - Nature https://www.nature.com/articles/s41599-023-01505-4
[5] How does AI enhance XDR platforms? - TEHTRIS https://tehtris.com/en/blog/how-does-ai-enhance-xdr-platforms
[6] XDR Should Be Viewed as An Open Architecture - Vectra AI https://www.vectra.ai/resources/research-reports/esg-xdr-open-architecture
By Perplexity at https://www.perplexity.ai/search/fd37ce22-dccf-4aa9-8478-d24cf6db23c4?s=m -->
- **Curated by:** [Edward Nyameri]
- **Funded by [optional]:** [No funding; interested POCs are welcome]
- **Shared by [optional]:** [Edward Nyameri]
- **Language(s) (NLP):** [LLM]
- **License:** [MIT]
### Dataset Sources [optional]
<!-- schooly-Computer Breaches -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Threat Hunting for AI cyber Security Tool Kit -->
### Direct Use
<!-- application platform analysis for Threat Hunters-->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Advancement of the threat hunt using computational intelligence to curb and contain the compromise of information -->
[More Information Needed]
### Source Data
<!-- 🏫 Computer Breaches-->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
varun-d/asdfasdfa | ---
license: openrail
---
|
heliosprime/twitter_dataset_1713191473 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 26387
num_examples: 72
download_size: 22992
dataset_size: 26387
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713191473"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Violence/Cloud | ---
license: afl-3.0
---
|
apsys/vc-pitches | ---
license: apache-2.0
---
|
mdacampora/tax-convos-sample2 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: turns
list:
- name: role
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 3823
num_examples: 5
download_size: 4907
dataset_size: 3823
---
# Dataset Card for "tax-convos-sample2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
genia-vdg/genia-dataset-01 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 833117.0
num_examples: 24
download_size: 323148
dataset_size: 833117.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zolak/twitter_dataset_80_1713048912 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2782422
num_examples: 6746
download_size: 1395589
dataset_size: 2782422
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CarperAI/openai_summarize_comparisons | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: test
num_bytes: 143018505
num_examples: 83629
- name: train
num_bytes: 157425966
num_examples: 92534
- name: valid1
num_bytes: 56686271
num_examples: 33082
- name: valid2
num_bytes: 86396487
num_examples: 50715
download_size: 20257716
dataset_size: 443527229
---
|
TinyPixel/f_2 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 1645034658
num_examples: 1000000
download_size: 950054945
dataset_size: 1645034658
---
# Dataset Card for "f_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Livingwithmachines/MapReader_Data_SIGSPATIAL_2022 | ---
annotations_creators:
- expert-generated
language:
- en
language_creators: []
license:
- cc-by-nc-sa-4.0
multilinguality: []
pretty_name: MapReader Data SIGSPATIAL 2022
size_categories:
- 10K<n<100K
source_datasets: []
tags:
- maps
- historical
- National Library of Scotland
- heritage
- humanities
- lam
task_categories:
- image-classification
task_ids:
- multi-class-image-classification
---
# Gold standards and outputs
## Dataset Description
- MapReader’s GitHub: https://github.com/Living-with-machines/MapReader
- MapReader paper: https://dl.acm.org/doi/10.1145/3557919.3565812
- Zenodo link for gold standards and outputs: https://doi.org/10.5281/zenodo.7147906
- Contacts: Katherine McDonough, The Alan Turing Institute, kmcdonough at turing.ac.uk; Kasra Hosseini, The Alan Turing Institute, k.hosseinizad at gmail.com
### Dataset Summary
Here we share gold standard annotations and outputs from early experiments using MapReader. MapReader creates datasets for humanities research using historical map scans and metadata as inputs.
Using maps provided by the National Library of Scotland, these annotations and outputs reflect labeling tasks relevant to historical research on the [Living with Machines](https://livingwithmachines.ac.uk/) project.
Data shared here is derived from maps printed in nineteenth-century Britain by the Ordnance Survey, Britain's state mapping agency. These maps cover England, Wales, and Scotland from 1888 to 1913.
## Directory structure
The gold standards and outputs are stored on [Zenodo](https://doi.org/10.5281/zenodo.7147906). It contains the following directories/files:
```
MapReader_Data_SIGSPATIAL_2022
├── README
├── annotations
│ ├── maps
│ │ ├── map_100942121.png
│ │ ├── ...
│ │ └── map_99383316.png
│ ├── slice_meters_100_100
│ │ ├── test
│ │ │ ├── patch-...PNG
│ │ │ ├── ...
│ │ │ └── patch-...PNG
│ │ ├── train
│ │ │ ├── patch-...PNG
│ │ │ ├── ...
│ │ │ └── patch-...PNG
│ │ └── val
│ │ ├── patch-...PNG
│ │ ├── ...
│ │ └── patch-...PNG
│ ├── test.csv
│ ├── train.csv
│ └── valid.csv
└── outputs
├── label_01_03
│ ├── pred_01_03_all.csv
│ ├── pred_01_03_keep_01_0250.csv
│ ├── pred_01_03_keep_05_0500.csv
│ └── pred_01_03_keep_10_1000.csv
├── label_02
│ ├── pred_02_all.csv
│ ├── pred_02_keep_01_0250.csv
│ ├── pred_02_keep_05_0500.csv
│ └── pred_02_keep_10_1000.csv
├── patches_all.csv
├── percentage
│ └── pred_02_keep_1_250_01_03_keep_1_250_percentage.csv
└── resources
├── StopsGB4paper.csv
└── six_inch4paper.json
```
## annotations
The `annotations` directory is as follows:
```
├── annotations
│ ├── maps
│ │ ├── map_100942121.png
│ │ ├── ...
│ │ └── map_99383316.png
│ ├── slice_meters_100_100
│ │ ├── test
│ │ │ ├── patch-...PNG
│ │ │ ├── ...
│ │ │ └── patch-...PNG
│ │ ├── train
│ │ │ ├── patch-...PNG
│ │ │ ├── ...
│ │ │ └── patch-...PNG
│ │ └── val
│ │ ├── patch-...PNG
│ │ ├── ...
│ │ └── patch-...PNG
│ ├── test.csv
│ ├── train.csv
│ └── valid.csv
```
### annotations/train.csv, valid.csv and test.csv
In the `MapReader_Data_SIGSPATIAL_2022/annotations` directory, there are three CSV files, namely `train.csv`, `valid.csv` and `test.csv`. These files have two columns:
```
image_id,label
slice_meters_100_100/train/patch-1390-3892-1529-4031-#map_101590193.png#.PNG,0
slice_meters_100_100/train/patch-1716-3960-1848-4092-#map_101439245.png#.PNG,0
...
```
in which:
- `image_id`: path to each labelled patch. For example in `slice_meters_100_100/train/patch-1390-3892-1529-4031-#map_101590193.png#.PNG`:
- `slice_meters_100_100/train`: directory where the patch is stored. (in this example, it is a patch used for training)
- `patch-1390-3892-1529-4031-#map_101590193.png#.PNG` itself has two parts: `patch-1390-3892-1529-4031` is the patch ID, and the patch was extracted from the `map_101590193.png` map sheet.
- `label`: label assigned to each patch by an annotator.
- Labels: 0: no [building or railspace]; 1: railspace; 2: building; and 3: railspace and [non railspace] building.
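A minimal parsing sketch for an `image_id` value (the filename pattern is inferred from the examples above, and the helper is illustrative rather than part of MapReader):

```python
import re

# Pixel bounds, then the parent map filename wrapped in '#':
PATCH_RE = re.compile(r"patch-(\d+)-(\d+)-(\d+)-(\d+)-#(.+?)#\.PNG$")

def parse_patch_path(image_id):
    directory, fname = image_id.rsplit("/", 1)
    m = PATCH_RE.search(fname)
    if m is None:
        raise ValueError(f"unexpected patch filename: {fname}")
    x_min, y_min, x_max, y_max = (int(g) for g in m.groups()[:4])
    return {
        "split": directory.rsplit("/", 1)[-1],    # train / val / test
        "bounds": (x_min, y_min, x_max, y_max),   # pixel coordinates on the sheet
        "parent_map": m.group(5),                 # e.g. map_101590193.png
    }

info = parse_patch_path(
    "slice_meters_100_100/train/patch-1390-3892-1529-4031-#map_101590193.png#.PNG"
)
print(info["bounds"], info["parent_map"])  # (1390, 3892, 1529, 4031) map_101590193.png
```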
### annotations/slice_meters_100_100
Patches used for training, validation, and test in PNG format.
```
├── annotations
│ ├── slice_meters_100_100
│ │ ├── test
│ │ │ ├── patch-...PNG
│ │ │ ├── ...
│ │ │ └── patch-...PNG
│ │ ├── train
│ │ │ ├── patch-...PNG
│ │ │ ├── ...
│ │ │ └── patch-...PNG
│ │ └── val
│ │ ├── patch-...PNG
│ │ ├── ...
│ │ └── patch-...PNG
```
### annotations/maps
Map sheets retrieved from the National Library of Scotland via webservers. These maps were later sliced into patches which can be found in `annotations/slice_meters_100_100`.
```
├── annotations
│ ├── maps
│ │ ├── map_100942121.png
│ │ ├── ...
│ │ └── map_99383316.png
```
## outputs
The `outputs` directory is as follows:
```
└── outputs
├── label_01_03
│ ├── pred_01_03_all.csv
│ ├── pred_01_03_keep_01_0250.csv
│ ├── pred_01_03_keep_05_0500.csv
│ └── pred_01_03_keep_10_1000.csv
├── label_02
│ ├── pred_02_all.csv
│ ├── pred_02_keep_01_0250.csv
│ ├── pred_02_keep_05_0500.csv
│ └── pred_02_keep_10_1000.csv
├── patches_all.csv
├── percentage
│ └── pred_02_keep_1_250_01_03_keep_1_250_percentage.csv
└── resources
├── StopsGB4paper.csv
└── six_inch4paper.json
```
### outputs/label_01_03
Starting with:
```
└── outputs
├── label_01_03
│ ├── pred_01_03_all.csv
│ ├── pred_01_03_keep_01_0250.csv
│ ├── pred_01_03_keep_05_0500.csv
│ └── pred_01_03_keep_10_1000.csv
```
The file `pred_01_03_all.csv` contains the following columns:
```
,center_lon,center_lat,pred,conf,mean_pixel_RGB,std_pixel_RGB,mean_pixel_A,image_id,parent_id,pub_date,url,x,y,z,opening_year_quicks,closing_year_quicks,dist2quicks
0,-0.4011055106547341,52.61260776720805,1,0.9898980855941772,0.8450341820716858,0.1668068021535873,1.0,patch-3014-0-3151-137-#map_100890251.png#.PNG,map_100890251.png,1902,https://maps.nls.uk/view/100890251,3880925.8529841416,-27169.29919979412,5044483.051365171,1867,1929,1121.9150481268305
1,-0.399645312864389,52.61260776720805,1,0.9999995231628418,0.823089599609375,0.1925655305385589,1.0,patch-3151-0-3288-137-#map_100890251.png#.PNG,map_100890251.png,1902,https://maps.nls.uk/view/100890251,3880926.544140446,-27070.392789791513,5044483.051365171,1867,1929,1113.0714735200893
...
```
- **center_lon**: longitude of the patch center
- **center_lat**: latitude of the patch center
- **pred**: predicted label for the patch
- **conf**: model confidence
- **mean_pixel_RGB**: mean pixel intensities, using all three channels
- **std_pixel_RGB**: standard deviations of pixel intensities, using all three channels
- **mean_pixel_A**: mean pixel intensities of alpha channel
- **image_id**: patch ID
- **parent_id**: ID of the map sheet that the patch belongs to
- **pub_date**: publication date of the map sheet that the patch belongs to
- **url**: URL of the map sheet that the patch belongs to
- **x, y, z**: to compute distances (using k-d tree)
- **opening_year_quicks**: date when the railway station first opened.
- **closing_year_quicks**: date when the railway station last closed.
- **dist2quicks**: distance to the closest StopsGB station in meters.
NB: See `outputs/resources` below for description of the StopsGB (railway station) data and links to related publications.
---
The other files in `outputs/label_01_03` have the same columns as `pred_01_03_all.csv` (described above). The difference is:
- `pred_01_03_all.csv`: all patches predicted as labels 1 (railspace) or 3 (railspace and [non railspace] building).
- `pred_01_03_keep_01_0250.csv`: similar to `pred_01_03_all.csv`, except that we removed patches that had no other neighboring patch with the same label within a radius of 250 meters (note the `01` and `0250` in the filename: `01` means one neighboring patch, `0250` means 250 meters).
- `pred_01_03_keep_05_0500.csv`: similar to `pred_01_03_all.csv`, except that we removed patches that had fewer than five neighboring patches with the same label within a radius of 500 meters.
- `pred_01_03_keep_10_1000.csv`: similar to `pred_01_03_all.csv`, except that we removed patches that had fewer than ten neighboring patches with the same label within a radius of 1000 meters.
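The filtering rule behind these `keep_*` files can be sketched in pure Python (brute force here, whereas the actual pipeline used a k-d tree on the `x, y, z` columns; the function name, thresholds, and coordinates are illustrative, not part of MapReader):

```python
import math

def keep_patches(points, min_neighbors, radius):
    """Indices of points that have at least `min_neighbors` other points
    within `radius` (coordinates and radius in the same units, meters here)."""
    kept = []
    for i, (xi, yi) in enumerate(points):
        n = sum(
            1
            for j, (xj, yj) in enumerate(points)
            if j != i and math.hypot(xi - xj, yi - yj) <= radius
        )
        if n >= min_neighbors:
            kept.append(i)
    return kept

# Three clustered patch centers and one isolated one:
pts = [(0, 0), (100, 0), (0, 100), (5000, 5000)]
print(keep_patches(pts, min_neighbors=1, radius=250))  # -> [0, 1, 2]
```

With `min_neighbors=1` and `radius=250` the isolated patch is dropped, mirroring how `pred_01_03_keep_01_0250.csv` was produced from `pred_01_03_all.csv`.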
### outputs/label_02
Next, these files:
```
├── label_02
│ ├── pred_02_all.csv
│ ├── pred_02_keep_01_0250.csv
│ ├── pred_02_keep_05_0500.csv
│ └── pred_02_keep_10_1000.csv
```
These are the same as the files described above for `label_01_03`, except for label 02 (i.e., building).
### outputs/patches_all.csv
Next:
```
└── outputs
├── patches_all.csv
```
The file `patches_all.csv` has the following columns:
⚠️ This file contains the results for the 30,490,411 patches used in the MapReader paper.
```
center_lat,center_lon,pred
52.61260776720805,-0.4332298620423274,0
52.61260776720805,-0.4317696642519822,0
...
```
in which:
- **center_lon**: longitude of the patch center
- **center_lat**: latitude of the patch center
- **pred**: predicted label for the patch
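Given the roughly 30.5 million rows, it is worth streaming this file rather than loading it at once. A stdlib-only sketch that tallies predicted labels (the sample rows are synthetic; column names as above):

```python
import csv
import io
from collections import Counter

def label_counts(lines):
    """Stream patch predictions row by row and tally the `pred` labels,
    without loading the whole file into memory."""
    counts = Counter()
    for row in csv.DictReader(lines):
        counts[row["pred"]] += 1
    return counts

# In practice this would be open("patches_all.csv"); a tiny in-memory sample here:
sample = io.StringIO(
    "center_lat,center_lon,pred\n"
    "52.6126,-0.4332,0\n"
    "52.6126,-0.4317,0\n"
    "52.6126,-0.4303,1\n"
)
print(label_counts(sample))  # Counter({'0': 2, '1': 1})
```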
### outputs/percentage
We have added one file in `outputs/percentage`:
```
└── outputs
├── percentage
│ └── pred_02_keep_1_250_01_03_keep_1_250_percentage.csv
```
This file has the following columns:
```
,center_lon,center_lat,pred,conf,mean_pixel_RGB,std_pixel_RGB,mean_pixel_A,image_id,parent_id,pub_date,url,x,y,z,dist2rail,dist2quicks,dist2quicks_km,dist2rail_km,dist2rail_minus_station,dist2quicks_km_quantized,dist2rail_km_quantized,dist2rail_minus_station_quantized,perc_neigh_rails,perc_neigh_builds,harmonic_mean_rail_build
0,-0.4040259062354244,52.61260776720805,2,0.9999010562896729,0.8095282316207886,0.1955385357141494,1.0,patch-2740-0-2877-137-#map_100890251.png#.PNG,map_100890251.png,1902,https://maps.nls.uk/view/100890251,3880924.4631095687,-27367.11196679585,5044483.051365171,197.8176497186437,1164.8640633870857,1.1648640633870857,0.1978176497186437,0.9670464136684418,1.0,0.0,0.5,7.198443579766536,4.669260700389105,5.664349046373668
1,-0.4054861040257695,52.61171342293056,2,0.9999876022338868,0.8741853833198547,0.1160899400711059,1.0,patch-2603-137-2740-274-#map_100890251.png#.PNG,map_100890251.png,1902,https://maps.nls.uk/view/100890251,3881002.836728637,-27466.57793328472,5044422.621073416,296.73252022623865,1290.9640259717814,1.2909640259717814,0.2967325202262386,0.9942315057455428,1.0,0.0,0.5,7.050092764378478,4.452690166975881,5.45813633371237
...
```
in which:
- **center_lon**: longitude of the patch center
- **center_lat**: latitude of the patch center
- **pred**: predicted label for the patch
- **conf**: model confidence
- **mean_pixel_RGB**: mean pixel intensities, using all three channels
- **std_pixel_RGB**: standard deviations of pixel intensities, using all three channels
- **mean_pixel_A**: mean pixel intensities of alpha channel
- **image_id**: patch ID
- **parent_id**: ID of the map sheet that the patch belongs to
- **pub_date**: publication date of the map sheet that the patch belongs to
- **url**: URL of the map sheet that the patch belongs to
- **x, y, z**: to compute distances (using k-d tree)
- **dist2rail**: distance to the closest railspace patch (i.e., the patch that is classified as 1: railspace or 3: railspace and [non railspace] building)
- **dist2quicks**: distance to the closest StopsGB station in meters.
- **dist2quicks_km**: distance to the closest StopsGB station in km.
- **dist2rail_km**: similar to **dist2rail** except in km.
- **dist2rail_minus_station**: | dist2rail_km - dist2quicks_km |
- **dist2quicks_km_quantized**: discrete version of **dist2quicks_km**, we used these intervals: [0. , 0.5), [0.5, 1.), [1., 1.5), ... , [4.5, 5.) and [5., inf).
- **dist2rail_km_quantized**: discrete version of **dist2rail_km**, we used these intervals: [0. , 0.5), [0.5, 1.), [1., 1.5), ... , [4.5, 5.) and [5., inf).
- **dist2rail_minus_station_quantized**: discrete version of **dist2rail_minus_station**, we used these intervals: [0. , 0.5), [0.5, 1.), [1., 1.5), ... , [4.5, 5.) and [5., inf).
- **perc_neigh_rails**: percentage of neighboring patches predicted as railspace (labels 01 and 03).
- **perc_neigh_builds**: percentage of neighboring patches predicted as building (label 02).
- **harmonic_mean_rail_build**: harmonic mean of **perc_neigh_rails** and **perc_neigh_builds**.
These additional `percentage` attributes shed light on the relationship between 'railspace' and stations, something we explore in further Living with Machines research.
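The quantized distances and the harmonic-mean column follow directly from the definitions above; a small illustrative sketch (bin edges as listed, with the example values taken from the first data row shown above):

```python
def quantize_km(d):
    """Bin a distance in km into [0, 0.5), [0.5, 1.0), ..., [4.5, 5.0), [5.0, inf),
    returning the lower edge of the bin (capped at 5.0 for the open-ended bin)."""
    return min(int(d / 0.5) * 0.5, 5.0)

def harmonic_mean(a, b):
    """Harmonic mean of two neighborhood percentages; 0 if either is 0."""
    return 2 * a * b / (a + b) if a and b else 0.0

# Values from the first data row shown above:
print(quantize_km(1.1648640633870857))   # dist2quicks_km      -> 1.0
print(quantize_km(0.1978176497186437))   # dist2rail_km        -> 0.0
print(round(harmonic_mean(7.198443579766536, 4.669260700389105), 6))  # -> 5.664349
```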
### outputs/resources
Finally, we have the following files:
```
└── outputs
└── resources
├── StopsGB4paper.csv
└── six_inch4paper.json
```
- `StopsGB4paper.csv`: this is a trimmed down version of StopsGB, a dataset documenting passenger railway stations in Great Britain (see [this link](https://bl.iro.bl.uk/concern/datasets/0abea1b1-2a43-4422-ba84-39b354c8bb09?locale=en) for the complete dataset). We filtered the stations as follows:
- Keep only stations for which "ghost_entry" and "cross_ref" columns are "False". (These two fields help remove records in the StopsGB dataset that are not actually stations, but relics of the original publication formatting.)
- "Opening" was NOT "unknown".
- The map sheet was surveyed during a year when the station was operational (i.e., "opening_year_quicks" <= survey_date_of_map_sheet <= "closing_year_quicks").
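Applied to a single record, the three filters can be sketched as follows (column names as above; the example row and `survey_year` are illustrative, and in practice the survey date would come from the map sheet metadata):

```python
def keep_station(row, survey_year):
    """Return True if a StopsGB record passes the three filters described above."""
    # 1. Not a ghost entry or cross-reference.
    if row["ghost_entry"] != "False" or row["cross_ref"] != "False":
        return False
    # 2. Opening date must be known.
    if row["Opening"] == "unknown":
        return False
    # 3. Station operational when the sheet was surveyed.
    return int(row["opening_year_quicks"]) <= survey_year <= int(row["closing_year_quicks"])

station = {
    "ghost_entry": "False",
    "cross_ref": "False",
    "Opening": "1867",
    "opening_year_quicks": "1867",
    "closing_year_quicks": "1929",
}
print(keep_station(station, survey_year=1902))  # -> True
print(keep_station(station, survey_year=1950))  # -> False
```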
You can learn more about the StopsGB dataset and how it was created from this paper:
```
Mariona Coll Ardanuy, Kaspar Beelen, Jon Lawrence, Katherine McDonough, Federico Nanni, Joshua Rhodes, Giorgia Tolfo, and Daniel C.S. Wilson. "Station to Station: Linking and Enriching Historical British Railway Data." In Computational Humanities Research (CHR2021). 2021.
```
```bibtex
@inproceedings{lwm-station-to-station-2021,
title = "Station to Station: Linking and Enriching Historical British Railway Data",
author = "Coll Ardanuy, Mariona and
Beelen, Kaspar and
Lawrence, Jon and
McDonough, Katherine and
Nanni, Federico and
Rhodes, Joshua and
Tolfo, Giorgia and
Wilson, Daniel CS",
booktitle = "Computational Humanities Research",
year = "2021",
}
```
- `six_inch4paper.json`: similar to [metadata_OS_Six_Inch_GB_WFS_light.json](https://github.com/Living-with-machines/MapReader/blob/main/mapreader/persistent_data/metadata_OS_Six_Inch_GB_WFS_light.json) on MapReader's GitHub with some minor changes.
## Dataset Creation
### Curation Rationale
These annotations of map patches are part of a research project to develop humanistic methods for structuring visual information on digitized historical maps. Dividing thousands of nineteenth-century map sheets into 100m x 100m patches and labeling those patches with historically-meaningful concepts diverges from traditional methods for creating data from maps, both in terms of scale (the number of maps being examined), and of type (raster-style patches vs. pixel-level vector data). For more on the rationale for this approach, see the following paper:
```
Kasra Hosseini, Katherine McDonough, Daniel van Strien, Olivia Vane, Daniel C S Wilson, Maps of a Nation? The Digitized Ordnance Survey for New Historical Research, *Journal of Victorian Culture*, Volume 26, Issue 2, April 2021, Pages 284–299.
```
```bibtex
@article{hosseini_maps_2021,
title = {Maps of a Nation? The Digitized Ordnance Survey for New Historical Research},
volume = {26},
rights = {All rights reserved},
issn = {1355-5502},
url = {https://doi.org/10.1093/jvcult/vcab009},
doi = {10.1093/jvcult/vcab009},
shorttitle = {Maps of a Nation?},
pages = {284--299},
number = {2},
journaltitle = {Journal of Victorian Culture},
author = {Hosseini, Kasra and {McDonough}, Katherine and van Strien, Daniel and Vane, Olivia and Wilson, Daniel C S},
urldate = {2021-05-19},
date = {2021-04-01},
}
```
### Source Data
#### Initial Data Access
Data was accessed via the National Library of Scotland's Historical Maps API: https://maps.nls.uk/projects/subscription-api/
The data shared here is derived from the six-inch-to-one-mile sheets printed between 1888 and 1913: https://maps.nls.uk/projects/subscription-api/#gb6inch
### Annotations and Outputs
The annotations and output datasets collected here are related to experiments to identify the 'footprint' of rail infrastructure in the UK, a concept we call 'railspace'. We also created a dataset to identify buildings on the maps.
#### Annotation process
The custom annotation interface built into MapReader is designed specifically to assist researchers in labeling patches relevant to concepts of interest to their research questions.
Our **guidelines** for the data shared here were:
- for any non-null label (railspace, building, or railspace + building), if a patch contains any visual signal for that label (e.g. 'railspace'), it should be assigned the relevant label. For example, if it is possible for an annotator to see a railway track passing through the corner of a patch, that patch is labeled as 'railspace'.
- the context around the patch should not be used as an aid in extreme cases where it is nearly impossible to determine whether a patch contains a non-null label
- however, the patch context shown in the annotation interface can be used to quickly distinguish between different content types, particularly where the contiguity of a type across patches is useful in determining what label to assign
- for 'railspace': use this label for any type of rail infrastructure as determined by expert labelers. This includes, for example, single-track mining railroads; larger double-track passenger routes; sidings and embankments; etc. It excludes urban trams.
- for 'building': use this label for any size building
- for 'building + railspace': use this label for patches combining these two types of content
Because 'none' (i.e. null) patches made up the vast majority of patches in the total dataset from these map sheets, we ordered patches to annotate based on their pixel intensity. This allowed us to focus first on patches containing more visual content printed on the map sheet, and later to move more quickly through the patches that captured parts of the map with little to no printed features.
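A minimal sketch of that ordering step (ours, not MapReader's actual implementation) could rank patches by mean grayscale value, so that darker patches, i.e. those with more printed ink, are presented to annotators first:

```python
import numpy as np

def mean_intensity(patch: np.ndarray) -> float:
    """Mean grayscale value of a patch; lower means more printed content."""
    return float(patch.mean())

def order_by_content(patches: dict) -> list:
    """Return patch ids ordered from most to least printed content."""
    return sorted(patches, key=lambda pid: mean_intensity(patches[pid]))

# Two toy 100x100 grayscale patches: one ink-dense, one nearly blank.
dense = np.full((100, 100), 40, dtype=np.uint8)
blank = np.full((100, 100), 250, dtype=np.uint8)
print(order_by_content({"dense": dense, "blank": blank}))  # → ['dense', 'blank']
```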
#### Who are the annotators?
Data shared here was annotated by Kasra Hosseini and Katherine McDonough.
Members of the Living with Machines research team contributed early annotations during the development of MapReader: Ruth Ahnert, Kaspar Beelen, Mariona Coll-Ardanuy, Emma Griffin, Tim Hobson, Jon Lawrence, Giorgia Tolfo, Daniel van Strien, Olivia Vane, and Daniel C.S. Wilson.
## Credits and re-use terms
### MapReader outputs
The files shared here (other than ```resources```) are released under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International (https://creativecommons.org/licenses/by-nc-sa/4.0/) (CC-BY-NC-SA) licence.
If you are interested in working with OS maps used to create these results, please also note the re-use terms of the original map images and metadata detailed below.
### Digitized maps
MapReader can retrieve maps from NLS (National Library of Scotland) via webservers. For all the digitized maps (retrieved or locally stored), please note the re-use terms:
Use of the digitised maps for commercial purposes is currently restricted by contract. Use of these digitised maps for non-commercial purposes is permitted under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International (https://creativecommons.org/licenses/by-nc-sa/4.0/) (CC-BY-NC-SA) licence. Please refer to https://maps.nls.uk/copyright.html#exceptions-os for details on copyright and re-use license.
### Map metadata
We have provided some metadata files on MapReader’s GitHub page (https://github.com/Living-with-machines/MapReader/tree/main/mapreader/persistent_data). For all these files, please note the re-use terms:
Use of the digitised maps for commercial purposes is currently restricted by contract. Use of these digitised maps for non-commercial purposes is permitted under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International (https://creativecommons.org/licenses/by-nc-sa/4.0/) (CC-BY-NC-SA) licence. Please refer to https://maps.nls.uk/copyright.html#exceptions-os for details on copyright and re-use license.
## Acknowledgements
This work was supported by Living with Machines (AHRC grant AH/S01179X/1) and The Alan Turing Institute (EPSRC grant EP/N510129/1).
Living with Machines, funded by the UK Research and Innovation (UKRI) Strategic Priority Fund, is a multidisciplinary collaboration delivered by the Arts and Humanities Research Council (AHRC), with The Alan Turing Institute, the British Library and the Universities of Cambridge, East Anglia, Exeter, and Queen Mary University of London. |
mask-distilled-one-sec-cv12/chunk_158 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1016276636
num_examples: 199583
download_size: 1035715202
dataset_size: 1016276636
---
# Dataset Card for "chunk_158"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HydraIndicLM/punjabi_alpaca_52K | ---
dataset_info:
features:
- name: input
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 46649317
num_examples: 52002
download_size: 18652304
dataset_size: 46649317
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
## About
This repo contains a 52K instruction set for Punjabi, translated from Alpaca.
## Citation
If you find this repository useful, please consider giving 👏 and citing:
```
@misc{PunjabiAlpaca,
author = {Sambit Sekhar and Shantipriya Parida},
title = {Punjabi Instruction Set Based on Alpaca},
year = {2023},
publisher = {Hugging Face},
journal = {Hugging Face repository},
howpublished = {\url{https://huggingface.co/OdiaGenAI}},
}
```
## License
This work is licensed under a
[Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License][cc-by-nc-sa].
[![CC BY-NC-SA 4.0][cc-by-nc-sa-image]][cc-by-nc-sa]
[cc-by-nc-sa]: http://creativecommons.org/licenses/by-nc-sa/4.0/
[cc-by-nc-sa-image]: https://licensebuttons.net/l/by-nc-sa/4.0/88x31.png
[cc-by-nc-sa-shield]: https://img.shields.io/badge/License-CC%20BY--NC--SA%204.0-lightgrey.svg
|
CyberHarem/higana_pokemon | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of higana (Pokémon)
This is the dataset of higana (Pokémon), containing 223 images and their tags.
The core tags of this character are `black_hair, breasts, short_hair, red_eyes, dark_skin, large_breasts, short_ponytail, dark-skinned_female`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 223 | 198.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/higana_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 223 | 121.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/higana_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 540 | 254.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/higana_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 223 | 181.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/higana_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 540 | 337.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/higana_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/higana_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 25 |  |  |  |  |  | 1girl, solo, nipples, blush, nude, smile, looking_at_viewer, navel, pussy |
| 1 | 21 |  |  |  |  |  | 1boy, 1girl, hetero, sex, solo_focus, vaginal, blush, sweat, nude, nipples, penis, girl_on_top, open_mouth, bar_censor, pussy, cowgirl_position |
| 2 | 5 |  |  |  |  |  | 1boy, 1girl, barefoot, blush, feet, hetero, penis, solo_focus, toes, mosaic_censoring, navel, smile, two-footed_footjob, nipples, sweat, bikini, cleavage, ejaculation, naked_cape, naked_cloak, nude, open_mouth, pov |
| 3 | 7 |  |  |  |  |  | 1girl, looking_at_viewer, short_shorts, smile, blush, grey_thighhighs, solo, bangs, cloak, over-kneehighs, bare_shoulders, black_shirt, cleavage, grey_shorts, open_mouth, pokemon_(creature), simple_background, sleeveless_shirt, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | nipples | blush | nude | smile | looking_at_viewer | navel | pussy | 1boy | hetero | sex | solo_focus | vaginal | sweat | penis | girl_on_top | open_mouth | bar_censor | cowgirl_position | barefoot | feet | toes | mosaic_censoring | two-footed_footjob | bikini | cleavage | ejaculation | naked_cape | naked_cloak | pov | short_shorts | grey_thighhighs | bangs | cloak | over-kneehighs | bare_shoulders | black_shirt | grey_shorts | pokemon_(creature) | simple_background | sleeveless_shirt | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:----------|:--------|:-------|:--------|:--------------------|:--------|:--------|:-------|:---------|:------|:-------------|:----------|:--------|:--------|:--------------|:-------------|:-------------|:-------------------|:-----------|:-------|:-------|:-------------------|:---------------------|:---------|:-----------|:--------------|:-------------|:--------------|:------|:---------------|:------------------|:--------|:--------|:-----------------|:-----------------|:--------------|:--------------|:---------------------|:--------------------|:-------------------|:-------------------|
| 0 | 25 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 21 |  |  |  |  |  | X | | X | X | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | | X | X | X | X | | X | | X | X | | X | | X | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | X | | X | | X | X | | | | | | | | | | | X | | | | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
Imran1/dogbalance_data | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Australian_shepherd
'1': Chihuahua
'2': French_bulldog
splits:
- name: train
num_bytes: 18102241.0
num_examples: 735
download_size: 18093424
dataset_size: 18102241.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/VALUE_cola_negative_inversion | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 368
num_examples: 3
- name: test
num_bytes: 166
num_examples: 1
- name: train
num_bytes: 596
num_examples: 8
download_size: 7183
dataset_size: 1130
---
# Dataset Card for "VALUE_cola_negative_inversion"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jason-lee08/TinyStoriesWithExclamations | ---
dataset_info:
features:
- name: text
struct:
- name: attention_mask
sequence: int64
- name: input_ids
sequence: int64
splits:
- name: train
num_bytes: 7567697960
num_examples: 2119719
- name: validation
num_bytes: 76086656
num_examples: 21990
download_size: 816196858
dataset_size: 7643784616
---
# Dataset Card for "TinyStoriesWithExclamations"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lokesh2002/txt2txt | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_Lajonbot__Llama-2-13b-hf-instruct-pl-lora_unload | ---
pretty_name: Evaluation run of Lajonbot/Llama-2-13b-hf-instruct-pl-lora_unload
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Lajonbot/Llama-2-13b-hf-instruct-pl-lora_unload](https://huggingface.co/Lajonbot/Llama-2-13b-hf-instruct-pl-lora_unload)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Lajonbot__Llama-2-13b-hf-instruct-pl-lora_unload\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-14T22:54:23.964972](https://huggingface.co/datasets/open-llm-leaderboard/details_Lajonbot__Llama-2-13b-hf-instruct-pl-lora_unload/blob/main/results_2023-10-14T22-54-23.964972.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0020973154362416107,\n\
\ \"em_stderr\": 0.00046850650303681895,\n \"f1\": 0.05818162751677859,\n\
\ \"f1_stderr\": 0.0013245165484434952,\n \"acc\": 0.4407302535404773,\n\
\ \"acc_stderr\": 0.01044050090848239\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0020973154362416107,\n \"em_stderr\": 0.00046850650303681895,\n\
\ \"f1\": 0.05818162751677859,\n \"f1_stderr\": 0.0013245165484434952\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11902956785443518,\n \
\ \"acc_stderr\": 0.008919702911161629\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7624309392265194,\n \"acc_stderr\": 0.011961298905803152\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Lajonbot/Llama-2-13b-hf-instruct-pl-lora_unload
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|arc:challenge|25_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_14T22_54_23.964972
path:
- '**/details_harness|drop|3_2023-10-14T22-54-23.964972.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-14T22-54-23.964972.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_14T22_54_23.964972
path:
- '**/details_harness|gsm8k|5_2023-10-14T22-54-23.964972.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-14T22-54-23.964972.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hellaswag|10_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-16T12:50:25.764084.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-16T12:50:25.764084.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-16T12:50:25.764084.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_14T22_54_23.964972
path:
- '**/details_harness|winogrande|5_2023-10-14T22-54-23.964972.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-14T22-54-23.964972.parquet'
- config_name: results
data_files:
- split: 2023_08_16T12_50_25.764084
path:
- results_2023-08-16T12:50:25.764084.parquet
- split: 2023_10_14T22_54_23.964972
path:
- results_2023-10-14T22-54-23.964972.parquet
- split: latest
path:
- results_2023-10-14T22-54-23.964972.parquet
---
# Dataset Card for Evaluation run of Lajonbot/Llama-2-13b-hf-instruct-pl-lora_unload
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Lajonbot/Llama-2-13b-hf-instruct-pl-lora_unload
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Lajonbot/Llama-2-13b-hf-instruct-pl-lora_unload](https://huggingface.co/Lajonbot/Llama-2-13b-hf-instruct-pl-lora_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Lajonbot__Llama-2-13b-hf-instruct-pl-lora_unload",
"harness_winogrande_5",
	split="latest")
```
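To load the results of one specific run instead of the latest ones, pass that run's split name. The split names listed in this card's config appear to be the run timestamp with `:` and `-` replaced by `_` (an assumption inferred from the split names above, e.g. `2023_10_14T22_54_23.964972` for run 2023-10-14T22:54:23.964972), which a small helper can reproduce:

```python
def run_split_name(timestamp: str) -> str:
    """Convert a run timestamp into the split name used in this dataset.

    Assumption: splits are named by replacing ':' and '-' in the
    timestamp with '_', matching the names listed in this card.
    """
    return timestamp.replace(":", "_").replace("-", "_")

print(run_split_name("2023-10-14T22:54:23.964972"))  # 2023_10_14T22_54_23.964972
```

For example, `load_dataset(..., "harness_winogrande_5", split=run_split_name("2023-10-14T22:54:23.964972"))` should select that run's results.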
## Latest results
These are the [latest results from run 2023-10-14T22:54:23.964972](https://huggingface.co/datasets/open-llm-leaderboard/details_Lajonbot__Llama-2-13b-hf-instruct-pl-lora_unload/blob/main/results_2023-10-14T22-54-23.964972.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0020973154362416107,
"em_stderr": 0.00046850650303681895,
"f1": 0.05818162751677859,
"f1_stderr": 0.0013245165484434952,
"acc": 0.4407302535404773,
"acc_stderr": 0.01044050090848239
},
"harness|drop|3": {
"em": 0.0020973154362416107,
"em_stderr": 0.00046850650303681895,
"f1": 0.05818162751677859,
"f1_stderr": 0.0013245165484434952
},
"harness|gsm8k|5": {
"acc": 0.11902956785443518,
"acc_stderr": 0.008919702911161629
},
"harness|winogrande|5": {
"acc": 0.7624309392265194,
"acc_stderr": 0.011961298905803152
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
axiong/pmc_oa_demo | ---
license: openrail
---
|
open-llm-leaderboard/details_jpquiroga__Mistral_7B_ties_merge_instruct_open_orca_codeninja | ---
pretty_name: Evaluation run of jpquiroga/Mistral_7B_ties_merge_instruct_open_orca_codeninja
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jpquiroga/Mistral_7B_ties_merge_instruct_open_orca_codeninja](https://huggingface.co/jpquiroga/Mistral_7B_ties_merge_instruct_open_orca_codeninja)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jpquiroga__Mistral_7B_ties_merge_instruct_open_orca_codeninja\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-15T16:23:04.591753](https://huggingface.co/datasets/open-llm-leaderboard/details_jpquiroga__Mistral_7B_ties_merge_instruct_open_orca_codeninja/blob/main/results_2024-04-15T16-23-04.591753.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.593791278114062,\n\
\ \"acc_stderr\": 0.033151969992860686,\n \"acc_norm\": 0.5976125048945863,\n\
\ \"acc_norm_stderr\": 0.033824061944462476,\n \"mc1\": 0.4222766217870257,\n\
\ \"mc1_stderr\": 0.017290733254248174,\n \"mc2\": 0.5878025444934631,\n\
\ \"mc2_stderr\": 0.015592863615568177\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5674061433447098,\n \"acc_stderr\": 0.014478005694182524,\n\
\ \"acc_norm\": 0.5938566552901023,\n \"acc_norm_stderr\": 0.014351656690097862\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6026687910774746,\n\
\ \"acc_stderr\": 0.004883455188908961,\n \"acc_norm\": 0.7994423421629158,\n\
\ \"acc_norm_stderr\": 0.003995992960088757\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.03910525752849726,\n\
\ \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.03910525752849726\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6458333333333334,\n\
\ \"acc_stderr\": 0.039994111357535424,\n \"acc_norm\": 0.6458333333333334,\n\
\ \"acc_norm_stderr\": 0.039994111357535424\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n\
\ \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.6011560693641619,\n\
\ \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5063829787234042,\n \"acc_stderr\": 0.03268335899936336,\n\
\ \"acc_norm\": 0.5063829787234042,\n \"acc_norm_stderr\": 0.03268335899936336\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n\
\ \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.37719298245614036,\n\
\ \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3968253968253968,\n \"acc_stderr\": 0.02519710107424649,\n \"\
acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.02519710107424649\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7032258064516129,\n \"acc_stderr\": 0.025988500792411898,\n \"\
acc_norm\": 0.7032258064516129,\n \"acc_norm_stderr\": 0.025988500792411898\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4088669950738916,\n \"acc_stderr\": 0.034590588158832314,\n \"\
acc_norm\": 0.4088669950738916,\n \"acc_norm_stderr\": 0.034590588158832314\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.02840895362624527,\n\
\ \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.02840895362624527\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5769230769230769,\n \"acc_stderr\": 0.025049197876042345,\n\
\ \"acc_norm\": 0.5769230769230769,\n \"acc_norm_stderr\": 0.025049197876042345\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945273,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945273\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.592436974789916,\n \"acc_stderr\": 0.03191863374478465,\n \
\ \"acc_norm\": 0.592436974789916,\n \"acc_norm_stderr\": 0.03191863374478465\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7779816513761468,\n \"acc_stderr\": 0.01781884956479664,\n \"\
acc_norm\": 0.7779816513761468,\n \"acc_norm_stderr\": 0.01781884956479664\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4537037037037037,\n \"acc_stderr\": 0.033953227263757976,\n \"\
acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.033953227263757976\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.02636165166838909,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.02636165166838909\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467765,\n\
\ \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467765\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.042365112580946315,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.042365112580946315\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.036429145782924055,\n\
\ \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.036429145782924055\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281365,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281365\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.014866821664709588,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.014866821664709588\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.638728323699422,\n \"acc_stderr\": 0.025862201852277895,\n\
\ \"acc_norm\": 0.638728323699422,\n \"acc_norm_stderr\": 0.025862201852277895\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2994413407821229,\n\
\ \"acc_stderr\": 0.015318257745976711,\n \"acc_norm\": 0.2994413407821229,\n\
\ \"acc_norm_stderr\": 0.015318257745976711\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.026716118380156844,\n\
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.026716118380156844\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6463022508038585,\n\
\ \"acc_stderr\": 0.027155208103200865,\n \"acc_norm\": 0.6463022508038585,\n\
\ \"acc_norm_stderr\": 0.027155208103200865\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6141975308641975,\n \"acc_stderr\": 0.027085401226132143,\n\
\ \"acc_norm\": 0.6141975308641975,\n \"acc_norm_stderr\": 0.027085401226132143\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3971631205673759,\n \"acc_stderr\": 0.02918980567358708,\n \
\ \"acc_norm\": 0.3971631205673759,\n \"acc_norm_stderr\": 0.02918980567358708\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42503259452411996,\n\
\ \"acc_stderr\": 0.012625879884892001,\n \"acc_norm\": 0.42503259452411996,\n\
\ \"acc_norm_stderr\": 0.012625879884892001\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.029163128570670733,\n\
\ \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.029163128570670733\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5882352941176471,\n \"acc_stderr\": 0.019910377463105932,\n \
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.019910377463105932\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.689795918367347,\n \"acc_stderr\": 0.029613459872484378,\n\
\ \"acc_norm\": 0.689795918367347,\n \"acc_norm_stderr\": 0.029613459872484378\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8009950248756219,\n\
\ \"acc_stderr\": 0.028231365092758406,\n \"acc_norm\": 0.8009950248756219,\n\
\ \"acc_norm_stderr\": 0.028231365092758406\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.038879718495972646,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.038879718495972646\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.031581495393387324,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.031581495393387324\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4222766217870257,\n\
\ \"mc1_stderr\": 0.017290733254248174,\n \"mc2\": 0.5878025444934631,\n\
\ \"mc2_stderr\": 0.015592863615568177\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7592738752959748,\n \"acc_stderr\": 0.012015559212224174\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.42608036391205456,\n \
\ \"acc_stderr\": 0.01362114439608671\n }\n}\n```"
repo_url: https://huggingface.co/jpquiroga/Mistral_7B_ties_merge_instruct_open_orca_codeninja
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|arc:challenge|25_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|gsm8k|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hellaswag|10_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T16-23-04.591753.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T16-23-04.591753.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- '**/details_harness|winogrande|5_2024-04-15T16-23-04.591753.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-15T16-23-04.591753.parquet'
- config_name: results
data_files:
- split: 2024_04_15T16_23_04.591753
path:
- results_2024-04-15T16-23-04.591753.parquet
- split: latest
path:
- results_2024-04-15T16-23-04.591753.parquet
---
# Dataset Card for Evaluation run of jpquiroga/Mistral_7B_ties_merge_instruct_open_orca_codeninja
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jpquiroga/Mistral_7B_ties_merge_instruct_open_orca_codeninja](https://huggingface.co/jpquiroga/Mistral_7B_ties_merge_instruct_open_orca_codeninja) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jpquiroga__Mistral_7B_ties_merge_instruct_open_orca_codeninja",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-04-15T16:23:04.591753](https://huggingface.co/datasets/open-llm-leaderboard/details_jpquiroga__Mistral_7B_ties_merge_instruct_open_orca_codeninja/blob/main/results_2024-04-15T16-23-04.591753.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.593791278114062,
"acc_stderr": 0.033151969992860686,
"acc_norm": 0.5976125048945863,
"acc_norm_stderr": 0.033824061944462476,
"mc1": 0.4222766217870257,
"mc1_stderr": 0.017290733254248174,
"mc2": 0.5878025444934631,
"mc2_stderr": 0.015592863615568177
},
"harness|arc:challenge|25": {
"acc": 0.5674061433447098,
"acc_stderr": 0.014478005694182524,
"acc_norm": 0.5938566552901023,
"acc_norm_stderr": 0.014351656690097862
},
"harness|hellaswag|10": {
"acc": 0.6026687910774746,
"acc_stderr": 0.004883455188908961,
"acc_norm": 0.7994423421629158,
"acc_norm_stderr": 0.003995992960088757
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.03910525752849726,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.03910525752849726
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6458333333333334,
"acc_stderr": 0.039994111357535424,
"acc_norm": 0.6458333333333334,
"acc_norm_stderr": 0.039994111357535424
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.037336266553835096,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.037336266553835096
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201943,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201943
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5063829787234042,
"acc_stderr": 0.03268335899936336,
"acc_norm": 0.5063829787234042,
"acc_norm_stderr": 0.03268335899936336
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.04559522141958216,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.04559522141958216
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.02519710107424649,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.02519710107424649
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7032258064516129,
"acc_stderr": 0.025988500792411898,
"acc_norm": 0.7032258064516129,
"acc_norm_stderr": 0.025988500792411898
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4088669950738916,
"acc_stderr": 0.034590588158832314,
"acc_norm": 0.4088669950738916,
"acc_norm_stderr": 0.034590588158832314
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.02840895362624527,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.02840895362624527
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5769230769230769,
"acc_stderr": 0.025049197876042345,
"acc_norm": 0.5769230769230769,
"acc_norm_stderr": 0.025049197876042345
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945273,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945273
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.592436974789916,
"acc_stderr": 0.03191863374478465,
"acc_norm": 0.592436974789916,
"acc_norm_stderr": 0.03191863374478465
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7779816513761468,
"acc_stderr": 0.01781884956479664,
"acc_norm": 0.7779816513761468,
"acc_norm_stderr": 0.01781884956479664
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.033953227263757976,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.033953227263757976
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.02636165166838909,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.02636165166838909
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.03980066246467765,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.03980066246467765
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946315,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946315
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.036429145782924055,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.036429145782924055
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281365,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281365
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.014866821664709588,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.014866821664709588
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.638728323699422,
"acc_stderr": 0.025862201852277895,
"acc_norm": 0.638728323699422,
"acc_norm_stderr": 0.025862201852277895
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2994413407821229,
"acc_stderr": 0.015318257745976711,
"acc_norm": 0.2994413407821229,
"acc_norm_stderr": 0.015318257745976711
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.026716118380156844,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.026716118380156844
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6463022508038585,
"acc_stderr": 0.027155208103200865,
"acc_norm": 0.6463022508038585,
"acc_norm_stderr": 0.027155208103200865
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6141975308641975,
"acc_stderr": 0.027085401226132143,
"acc_norm": 0.6141975308641975,
"acc_norm_stderr": 0.027085401226132143
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3971631205673759,
"acc_stderr": 0.02918980567358708,
"acc_norm": 0.3971631205673759,
"acc_norm_stderr": 0.02918980567358708
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42503259452411996,
"acc_stderr": 0.012625879884892001,
"acc_norm": 0.42503259452411996,
"acc_norm_stderr": 0.012625879884892001
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.029163128570670733,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.029163128570670733
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.019910377463105932,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.019910377463105932
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.689795918367347,
"acc_stderr": 0.029613459872484378,
"acc_norm": 0.689795918367347,
"acc_norm_stderr": 0.029613459872484378
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8009950248756219,
"acc_stderr": 0.028231365092758406,
"acc_norm": 0.8009950248756219,
"acc_norm_stderr": 0.028231365092758406
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.038879718495972646,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.038879718495972646
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.031581495393387324,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.031581495393387324
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4222766217870257,
"mc1_stderr": 0.017290733254248174,
"mc2": 0.5878025444934631,
"mc2_stderr": 0.015592863615568177
},
"harness|winogrande|5": {
"acc": 0.7592738752959748,
"acc_stderr": 0.012015559212224174
},
"harness|gsm8k|5": {
"acc": 0.42608036391205456,
"acc_stderr": 0.01362114439608671
}
}
```
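To sketch how these per-task metrics might be consumed once the JSON file is downloaded and parsed (e.g. with `json.load`), one could average the accuracy over a family of tasks. The tiny inline dict and the `mean_acc` helper below are illustrative assumptions, not code shipped with this repo:

```python
# Illustrative subset of the per-task metrics shown above; the real JSON file
# holds the full set of evaluated tasks.
results = {
    "harness|hendrycksTest-virology|5": {"acc": 0.4759036144578313},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.783625730994152},
    "harness|winogrande|5": {"acc": 0.7592738752959748},
}

def mean_acc(metrics, prefix):
    """Average the 'acc' metric over tasks whose key starts with the given prefix."""
    accs = [m["acc"] for task, m in metrics.items() if task.startswith(prefix)]
    return sum(accs) / len(accs)

# Mean accuracy over the MMLU ("hendrycksTest") tasks included in this subset.
print(mean_acc(results, "harness|hendrycksTest"))
```

The same pattern works for any task-name prefix in the results dict.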
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
JbIPS/stanford-dogs | ---
license: mit
---
|
yiyic/MTG_QG | ---
task_categories:
- text-generation
- question-answering
size_categories:
- 10K<n<100K
--- |
huangyt/FINETUNE4_TEST | ---
license: openrail
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/1706e1cd | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 188
num_examples: 10
download_size: 1339
dataset_size: 188
---
# Dataset Card for "1706e1cd"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gsstein/25-percent-human-dataset | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: summary
dtype: string
- name: text
dtype: string
- name: prompt
dtype: string
- name: generated
dtype: bool
splits:
- name: train
num_bytes: 86221172
num_examples: 15326
- name: test
num_bytes: 3062111
num_examples: 576
- name: validation
num_bytes: 3258681
num_examples: 576
download_size: 57267649
dataset_size: 92541964
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
galman33/gal_yair_83000_100x100 | ---
dataset_info:
features:
- name: lat
dtype: float64
- name: lon
dtype: float64
- name: country_code
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 1423239502.0
num_examples: 83000
download_size: 1423108777
dataset_size: 1423239502.0
---
# Dataset Card for "gal_yair_83000_100x100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/hayashimo_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of hayashimo/早霜/早霜 (Kantai Collection)
This is the dataset of hayashimo/早霜/早霜 (Kantai Collection), containing 500 images and their tags.
The core tags of this character are `black_hair, long_hair, hair_over_one_eye, ribbon, very_long_hair, hair_ribbon, bow, brown_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 428.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hayashimo_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 281.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hayashimo_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1071 | 558.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hayashimo_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 390.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hayashimo_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1071 | 723.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hayashimo_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hayashimo_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, solo, blush, looking_at_viewer, simple_background, sitting, white_background, barefoot, medium_breasts, navel, nude, purple_eyes, smile, bikini, yellow_eyes |
| 1 | 6 |  |  |  |  |  | 1girl, black_bra, black_panties, solo, looking_at_viewer, navel, simple_background, small_breasts, underwear_only, blush, white_background, cleavage |
| 2 | 6 |  |  |  |  |  | 1girl, blush, looking_at_viewer, open_shirt, solo, black_panties, large_breasts, open_mouth, purple_eyes, grey_pantyhose, navel, nipples, no_bra, smile, white_shirt, dakimakura_(medium), full_body, on_back, skirt |
| 3 | 19 |  |  |  |  |  | 1girl, bowtie, school_uniform, solo, white_shirt, long_sleeves, looking_at_viewer, purple_dress, halterneck, simple_background, white_background, white_ribbon, cowboy_shot |
| 4 | 9 |  |  |  |  |  | 1girl, bowtie, grey_pantyhose, looking_at_viewer, school_uniform, solo, white_shirt, lace-up_boots, simple_background, long_sleeves, white_background, full_body, skirt, smile |
| 5 | 5 |  |  |  |  |  | 1girl, bowtie, holding_umbrella, long_sleeves, looking_at_viewer, school_uniform, solo, white_shirt, rain, smile, sleeveless_dress, hydrangea, upper_body |
| 6 | 14 |  |  |  |  |  | 1girl, enmaided, solo, black_dress, maid_headdress, looking_at_viewer, simple_background, maid_apron, smile, white_apron, long_sleeves, white_background, frilled_apron, puffy_sleeves, twitter_username, blush, one-hour_drawing_challenge, breasts, dated, pantyhose, purple_eyes |
| 7 | 21 |  |  |  |  |  | playboy_bunny, 1girl, bowtie, detached_collar, fake_animal_ears, rabbit_ears, wrist_cuffs, solo, strapless_leotard, purple_leotard, looking_at_viewer, rabbit_tail, grey_pantyhose, simple_background, fishnet_pantyhose, adapted_costume, small_breasts, blush, cleavage, white_background |
| 8 | 7 |  |  |  |  |  | 1boy, 1girl, blush, hetero, penis, solo_focus, fellatio, medium_breasts, open_mouth, closed_eyes, mosaic_censoring, nipples, shirt, bowtie, breasts_out, cum_on_tongue, facial, male_pubic_hair |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | blush | looking_at_viewer | simple_background | sitting | white_background | barefoot | medium_breasts | navel | nude | purple_eyes | smile | bikini | yellow_eyes | black_bra | black_panties | small_breasts | underwear_only | cleavage | open_shirt | large_breasts | open_mouth | grey_pantyhose | nipples | no_bra | white_shirt | dakimakura_(medium) | full_body | on_back | skirt | bowtie | school_uniform | long_sleeves | purple_dress | halterneck | white_ribbon | cowboy_shot | lace-up_boots | holding_umbrella | rain | sleeveless_dress | hydrangea | upper_body | enmaided | black_dress | maid_headdress | maid_apron | white_apron | frilled_apron | puffy_sleeves | twitter_username | one-hour_drawing_challenge | breasts | dated | pantyhose | playboy_bunny | detached_collar | fake_animal_ears | rabbit_ears | wrist_cuffs | strapless_leotard | purple_leotard | rabbit_tail | fishnet_pantyhose | adapted_costume | 1boy | hetero | penis | solo_focus | fellatio | closed_eyes | mosaic_censoring | shirt | breasts_out | cum_on_tongue | facial | male_pubic_hair |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------------------|:--------------------|:----------|:-------------------|:-----------|:-----------------|:--------|:-------|:--------------|:--------|:---------|:--------------|:------------|:----------------|:----------------|:-----------------|:-----------|:-------------|:----------------|:-------------|:-----------------|:----------|:---------|:--------------|:----------------------|:------------|:----------|:--------|:---------|:-----------------|:---------------|:---------------|:-------------|:---------------|:--------------|:----------------|:-------------------|:-------|:-------------------|:------------|:-------------|:-----------|:--------------|:-----------------|:-------------|:--------------|:----------------|:----------------|:-------------------|:-----------------------------|:----------|:--------|:------------|:----------------|:------------------|:-------------------|:--------------|:--------------|:--------------------|:-----------------|:--------------|:--------------------|:------------------|:-------|:---------|:--------|:-------------|:-----------|:--------------|:-------------------|:--------|:--------------|:----------------|:---------|:------------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | | X | | | X | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | X | | | | | | X | | X | X | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 19 |  |  |  |  |  | X | X | | X | X | | X | | | | | | | | | | | | | | | | | | | | X | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | X | | X | X | | X | | | | | | X | | | | | | | | | | | X | | | X | | X | | X | X | X | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | X | | X | | | | | | | | | X | | | | | | | | | | | | | | X | | | | | X | X | X | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 14 |  |  |  |  |  | X | X | X | X | X | | X | | | | | X | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 21 |  |  |  |  |  | X | X | X | X | X | | X | | | | | | | | | | | X | | X | | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 8 | 7 |  |  |  |  |  | X | | X | | | | | | X | | | | | | | | | | | | | | X | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
edarchimbaud/perimeter-stocks | ---
dataset_info:
features:
- name: symbol
dtype: string
- name: security
dtype: string
- name: gics_sector
dtype: string
- name: gics_sub_industry
dtype: string
splits:
- name: train
num_bytes: 111992
num_examples: 1500
download_size: 44216
dataset_size: 111992
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "perimeter-stocks"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yleo/emerton_dpo_pairs | ---
dataset_info:
features:
- name: system
dtype: string
- name: question
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 15630325.68343667
num_examples: 5489
download_size: 9101980
dataset_size: 15630325.68343667
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
This dataset is similar to [Intel/orca_dpo_pairs](https://huggingface.co/datasets/Intel/orca_dpo_pairs), with slightly fewer entries and with the GPT-3.5 answers replaced by GPT-4 Turbo answers.
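For illustration, a row with the card's `system`/`question`/`chosen`/`rejected` fields can be reshaped into the (prompt, chosen, rejected) triple that DPO trainers typically expect. The `to_dpo_triple` helper and the inline example row below are a hypothetical sketch, not code shipped with the dataset:

```python
def to_dpo_triple(row):
    """Fold the system message into the prompt and keep both completions."""
    if row["system"]:
        prompt = f"{row['system']}\n\n{row['question']}"
    else:
        prompt = row["question"]
    return {"prompt": prompt, "chosen": row["chosen"], "rejected": row["rejected"]}

# Hypothetical example row mirroring the dataset's four string features.
example = {
    "system": "You are a helpful assistant.",
    "question": "What is 2 + 2?",
    "chosen": "2 + 2 equals 4.",
    "rejected": "2 + 2 equals 5.",
}
print(to_dpo_triple(example)["prompt"])
```

The same mapping can be applied over the whole split, e.g. with `datasets.Dataset.map`.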
valdineiarcenio/oigente | ---
license: openrail
---
|
xhiroga/MiniAlbum | ---
license: mit
---
|
arieg/siamese_clusters | ---
dataset_info:
features:
- name: image
dtype: image
- name: track_id
dtype:
class_label:
names:
'0': '000002'
'1': '000005'
'2': '000010'
'3': '000140'
'4': '000141'
'5': 000148
'6': 000182
'7': 000190
'8': 000193
'9': 000194
'10': 000197
'11': '000200'
'12': '000203'
'13': '000204'
'14': '000207'
'15': '000210'
'16': '000211'
'17': '000212'
'18': '000213'
'19': '000255'
'20': '000256'
'21': 000368
'22': '000424'
'23': 000459
'24': '000534'
'25': '000540'
'26': '000546'
'27': '000574'
'28': '000602'
'29': '000615'
'30': '000620'
'31': '000621'
'32': '000625'
'33': '000666'
'34': '000667'
'35': '000676'
'36': 000690
'37': 000694
'38': 000695
'39': '000704'
'40': '000705'
'41': '000706'
'42': '000707'
'43': 000708
'44': 000709
'45': '000714'
'46': '000715'
'47': '000716'
'48': 000718
'49': '000777'
'50': 000814
'51': 000821
'52': 000822
'53': 000825
'54': 000853
'55': 000890
'56': 000892
'57': 000897
'58': 000993
'59': 000995
'60': 000997
'61': 000998
'62': 001039
'63': '001040'
'64': '001066'
'65': 001069
'66': '001073'
'67': '001075'
'68': 001082
'69': 001083
'70': 001087
'71': '001102'
'72': 001193
'73': 001195
'74': 001196
'75': 001197
'76': 001249
'77': 001259
'78': '001270'
'79': '001276'
'80': '001277'
'81': 001278
'82': '001417'
'83': '001427'
'84': '001443'
'85': 001482
'86': '001510'
'87': '001544'
'88': '001642'
'89': '001644'
'90': 001649
'91': '001661'
'92': '001663'
'93': '001666'
'94': '001673'
'95': 001680
'96': 001681
'97': 001682
'98': 001683
'99': 001684
'100': 001685
'101': 001686
'102': 001687
'103': 001688
'104': 001689
'105': '001701'
'106': '001702'
'107': '001703'
'108': '001704'
'109': '001706'
'110': '001720'
'111': '001732'
'112': '001733'
'113': '001735'
'114': '001736'
'115': 001883
'116': 001891
'117': 001893
'118': 001924
'119': 001925
'120': 001929
'121': 001930
'122': '002012'
'123': 002096
'124': 002097
'125': 002099
'126': '003263'
'127': '003264'
'128': '003265'
'129': '003266'
'130': '003270'
'131': '003271'
'132': '003272'
'133': '003273'
'134': '003274'
'135': 003492
'136': '003532'
'137': '003533'
'138': '003534'
'139': '003535'
'140': '003537'
'141': 003538
'142': '003573'
'143': 003598
'144': '003624'
'145': '003707'
'146': 003708
'147': '003720'
'148': '003721'
'149': '003722'
'150': '003724'
'151': '003725'
'152': '003761'
'153': '003762'
'154': '003763'
'155': '003765'
'156': '003766'
'157': '003775'
'158': '003776'
'159': '003777'
'160': 003778
'161': 003779
'162': 003832
'163': 003833
'164': 003840
'165': 003880
'166': 003895
'167': 003896
'168': 003904
'169': 003905
'170': 003906
'171': 003908
'172': 003909
'173': 003910
'174': 003911
'175': 003912
'176': 003913
'177': 003920
'178': 003921
'179': 003950
'180': '004013'
'181': '004017'
'182': '004022'
'183': '004037'
'184': '004066'
'185': '004067'
'186': 004068
'187': 004069
'188': '004070'
'189': '004071'
'190': '004072'
'191': '004073'
'192': '004074'
'193': '004075'
'194': '004076'
'195': '004077'
'196': 004078
'197': 004079
'198': 004080
'199': 004091
'200': 004092
'201': 004093
'202': 004094
'203': 004095
'204': 004096
'205': 004097
'206': 004098
'207': 004099
'208': '004100'
'209': '004101'
'210': '004102'
'211': '004103'
'212': 004108
'213': '004232'
'214': '004233'
'215': '004234'
'216': '004235'
'217': '004236'
'218': 004239
'219': '004450'
'220': '004507'
'221': 004508
'222': 004509
'223': '004510'
'224': '004511'
'225': 004519
'226': '004520'
'227': '004521'
'228': '004522'
'229': 004682
'230': 004684
'231': 004685
'232': 004688
'233': '004777'
'234': 004778
'235': 004779
'236': 004780
'237': 004781
'238': 004782
'239': 004784
'240': 004785
'241': 004786
'242': 004787
'243': 004788
'244': 004799
'245': 004835
'246': 004836
'247': 004838
'248': 004846
'249': 004848
'250': 004849
'251': '005006'
'252': '005156'
'253': '005157'
'254': 005158
'255': 005159
'256': 005169
'257': '005170'
'258': '005171'
'259': 005191
'260': '005264'
'261': 005268
'262': '005376'
'263': 005381
'264': '005521'
'265': 005879
'266': 005936
'267': 005940
'268': 006329
'269': '006330'
'270': '006331'
'271': '006332'
'272': '006333'
'273': '006342'
'274': '006354'
'275': '006357'
'276': 006358
'277': '006360'
'278': '006363'
'279': '006366'
'280': '006367'
'281': 006368
'282': '006370'
'283': '006372'
'284': '006373'
'285': '006376'
'286': 006379
'287': 006380
'288': 006381
'289': 006382
'290': 006383
'291': 006385
'292': 006387
'293': 006389
'294': 006390
'295': 006393
'296': 006394
'297': 006396
'298': '006406'
'299': '006407'
'300': 006439
'301': '006440'
'302': '006442'
'303': '006443'
'304': 006448
'305': 006459
'306': '006461'
'307': '006463'
'308': '006467'
'309': 006469
'310': '006517'
'311': 006519
'312': '006603'
'313': '006605'
'314': '006606'
'315': '006607'
'316': 006608
'317': 006609
'318': '006610'
'319': '006611'
'320': '006674'
'321': '006675'
'322': '006677'
'323': 006679
'324': 006680
'325': 006684
'326': '006762'
'327': '006776'
'328': 006778
'329': 006779
'330': 006782
'331': 006783
'332': 006788
'333': 006802
'334': 006803
'335': 006854
'336': 006855
'337': 006856
'338': 006857
'339': '007011'
'340': '007373'
'341': '007374'
'342': '007375'
'343': '007376'
'344': '007377'
'345': 007378
'346': 007379
'347': 007381
'348': 007383
'349': 007385
'350': 007386
'351': 007388
'352': 007391
'353': 007393
'354': 007481
'355': 007482
'356': 007483
'357': 007487
'358': 007488
'359': 007489
'360': 007490
'361': 007491
'362': 007492
'363': 007495
'364': '007526'
'365': '007527'
'366': 007528
'367': 007529
'368': 007548
'369': '007554'
'370': 007709
'371': '007710'
'372': '007711'
'373': '007712'
'374': '007713'
'375': 007872
'376': 008056
'377': 008208
'378': 008256
'379': 008259
'380': 008261
'381': 008345
'382': 008357
'383': 008363
'384': 008372
'385': 008416
'386': 009152
'387': 009155
'388': 009307
'389': 009476
'390': 009477
'391': 009491
'392': 009505
'393': 009511
'394': 009512
'395': 009513
'396': 009550
'397': 009553
'398': 009555
'399': 009557
'400': 009559
'401': 009560
'402': 009678
'403': 009721
'404': 009846
'405': 009887
'406': 009888
'407': 009918
'408': 009962
'409': 010186
'410': 010192
'411': '010250'
'412': '010374'
'413': '010375'
'414': '010376'
'415': '010377'
'416': 010381
'417': 010382
'418': 010383
'419': 010384
'420': 010385
'421': 010386
'422': 010387
'423': 010388
'424': 010389
'425': '010435'
'426': 010438
'427': 010439
'428': '010440'
'429': '010441'
'430': '010442'
'431': '010443'
'432': '010444'
'433': '010447'
'434': 010458
'435': 010480
'436': 010481
'437': 010485
'438': '010521'
'439': '010527'
'440': '010535'
'441': '010541'
'442': '010575'
'443': '010577'
'444': 010668
'445': 010669
'446': '010670'
'447': '010671'
'448': '010672'
'449': '010673'
'450': '010674'
'451': '010675'
'452': '010676'
'453': '010677'
'454': 010678
'455': 010679
'456': 010682
'457': 010684
'458': 010693
'459': 010694
'460': 010695
'461': 010696
'462': 010697
'463': 010698
'464': 010699
'465': 010805
'466': 010806
'467': 010807
'468': 010808
'469': 010809
'470': 010810
'471': 010983
'472': 010992
'473': 010993
'474': 011019
'475': '011020'
'476': 011059
'477': 011198
'478': 011199
'479': '011200'
'480': '011204'
'481': '011206'
'482': '011234'
'483': '011237'
'484': 011239
'485': '011242'
'486': '011261'
'487': '011262'
'488': '011264'
'489': 011268
'490': 011298
'491': 011299
'492': '011306'
'493': '011333'
'494': '011334'
'495': '011503'
'496': '011504'
'497': '011505'
'498': 011508
'499': '011544'
'500': 011638
'501': '011671'
'502': '011672'
'503': '011673'
'504': '011674'
'505': '011675'
'506': '011677'
'507': 011679
'508': 011681
'509': 011682
'510': 011683
'511': '011763'
'512': '011764'
'513': '011765'
'514': '011766'
'515': '011767'
'516': 011768
'517': 011769
'518': '011770'
'519': '011771'
'520': '011772'
'521': '011773'
'522': '011774'
'523': '011775'
'524': '011776'
'525': '011777'
'526': 011778
'527': 011779
'528': 011780
'529': 011781
'530': 011782
'531': 011783
'532': 011784
'533': 011785
'534': 011786
'535': 011787
'536': 011788
'537': 011789
'538': 011790
'539': 011791
'540': 011792
'541': 011793
'542': 011794
'543': 011795
'544': 011803
'545': 011818
'546': 011839
'547': 011861
'548': 011862
'549': 011867
'550': 011868
'551': 011916
'552': 011917
'553': 011918
'554': 011919
'555': 011920
'556': 011921
'557': 011922
'558': 011933
'559': 011937
'560': 011942
'561': 011946
'562': 011947
'563': 011951
'564': '012045'
'565': '012046'
'566': '012047'
'567': 012048
'568': 012049
'569': '012050'
'570': '012051'
'571': '012052'
'572': '012053'
'573': 012058
'574': 012059
'575': '012060'
'576': '012061'
'577': '012062'
'578': '012064'
'579': '012065'
'580': '012066'
'581': '012067'
'582': 012109
'583': '012146'
'584': '012147'
'585': '012173'
'586': '012174'
'587': 012179
'588': 012188
'589': 012189
'590': '012346'
'591': 012348
'592': 012349
'593': '012350'
'594': '012351'
'595': '012352'
'596': '012353'
'597': '012355'
'598': '012376'
'599': 012387
'600': 012390
'601': 012394
'602': 012481
'603': 012482
'604': 012484
'605': 012485
'606': 012486
'607': 012487
'608': 012488
'609': 012489
'610': 012490
'611': 012508
'612': '012513'
'613': '012514'
'614': 012518
'615': '012521'
'616': '012526'
'617': '012527'
'618': '012530'
'619': '012531'
'620': '012532'
'621': '012537'
'622': '012551'
'623': '012552'
'624': '012654'
'625': 012690
'626': 012691
'627': 012692
'628': '012737'
'629': 012985
'630': 012986
'631': 013191
'632': 013197
'633': 013199
'634': '013201'
'635': 013218
'636': '013220'
'637': '013325'
'638': 013328
'639': '013362'
'640': 013378
'641': '013474'
'642': '013537'
'643': 013538
'644': 013539
'645': '013540'
'646': '013556'
'647': '013561'
'648': '013562'
'649': '013566'
'650': '013571'
'651': 013578
'652': 013591
'653': 013596
'654': '013666'
'655': 013668
'656': '013670'
'657': '013706'
'658': '013707'
'659': 013708
'660': 013709
'661': '013710'
'662': '013711'
'663': '013735'
'664': '013747'
'665': 013748
'666': 013749
'667': '013767'
'668': 013768
'669': 013804
'670': 013927
'671': 013928
'672': 013929
'673': 013930
'674': '014063'
'675': 014208
'676': '014315'
'677': '014316'
'678': '014317'
'679': 014318
'680': 014319
'681': '014320'
'682': '014344'
'683': 014358
'684': '014363'
'685': '014365'
'686': 014386
'687': 014391
'688': 014538
'689': 014539
'690': '014541'
'691': '014542'
'692': 014568
'693': 014569
'694': '014570'
'695': '014571'
'696': '014572'
'697': '014576'
'698': '014577'
'699': 014578
'700': 014579
'701': 014580
'702': 014581
'703': 014583
'704': 014584
'705': 014585
'706': 014586
'707': 014588
'708': 014589
'709': 014590
'710': '014601'
'711': '014602'
'712': '014603'
'713': '014604'
'714': '014653'
'715': '014661'
'716': '014663'
'717': 014684
'718': 014690
'719': 014693
'720': '014733'
'721': '014734'
'722': '014735'
'723': '014736'
'724': '014737'
'725': 014738
'726': 014739
'727': '014740'
'728': '014741'
'729': '014742'
'730': '014743'
'731': '014744'
'732': '014745'
'733': 014809
'734': 014869
'735': 015094
'736': '015210'
'737': '015464'
'738': 015469
'739': '015471'
'740': '015475'
'741': '015476'
'742': 015487
'743': 015488
'744': '015540'
'745': '015541'
'746': '015542'
'747': '015543'
'748': '015625'
'749': 015769
'750': '015770'
'751': '015771'
'752': '015772'
'753': '015773'
'754': 015880
'755': 016095
'756': '016155'
'757': 016158
'758': '016162'
'759': '016163'
'760': '016334'
'761': '016337'
'762': 016338
'763': 016339
'764': '016340'
'765': '016354'
'766': '016743'
'767': '016744'
'768': '016745'
'769': '016747'
'770': 016819
'771': 016820
'772': 016821
'773': 016822
'774': 016878
'775': 016879
'776': 016880
'777': 016895
'778': 016994
'779': 016995
'780': 016997
'781': '017132'
'782': '017344'
'783': '017345'
'784': '017462'
'785': 017491
'786': 017496
'787': 017499
'788': '017500'
'789': '017573'
'790': 017588
'791': '017605'
'792': '017606'
'793': '017607'
'794': 017608
'795': 017609
'796': '017610'
'797': '017611'
'798': '017631'
'799': '017632'
'800': '017633'
'801': '017634'
'802': '017635'
'803': '017636'
'804': '017637'
'805': '017644'
'806': '017735'
'807': 017782
'808': 017884
'809': 017906
'810': 018031
'811': 018032
'812': 018033
'813': 018034
'814': 018037
'815': 018038
'816': 018039
'817': 018043
'818': 018044
'819': 018112
'820': 018124
'821': 018144
'822': 018145
'823': 018146
'824': 018159
'825': 018197
'826': 018350
'827': 018607
'828': 018611
'829': 018876
'830': 018877
'831': 018887
'832': 019073
'833': 019074
'834': 019179
'835': 019184
'836': 019187
'837': 019192
'838': 019412
'839': 019413
'840': 019415
'841': 019416
'842': 019417
'843': 019418
'844': 019420
'845': 019422
'846': 019423
'847': 019425
'848': 019438
'849': 019439
'850': 019441
'851': 019442
'852': 019459
'853': 019673
'854': 019674
'855': 019685
'856': 019689
'857': 019707
'858': 019708
'859': 019729
'860': 019758
'861': 019759
'862': 019760
'863': 019889
'864': 019890
'865': 019891
'866': '020050'
'867': 020296
'868': '020361'
'869': '020362'
'870': '020364'
'871': '020365'
'872': '020366'
'873': 020369
'874': '020372'
'875': '020373'
'876': '020374'
'877': '020375'
'878': '020376'
'879': '020424'
'880': '020432'
'881': 020469
'882': '020667'
'883': '020704'
'884': 020818
'885': 021058
'886': 021085
'887': 021087
'888': '021167'
'889': 021228
'890': '021231'
'891': '021232'
'892': '021400'
'893': '021401'
'894': '021402'
'895': '021403'
'896': '021404'
'897': 021409
'898': '021422'
'899': '021565'
'900': 021587
'901': '021657'
'902': '021672'
'903': '021676'
'904': '021677'
'905': '021707'
'906': '021774'
'907': 021842
'908': 021859
'909': 021860
'910': 021891
'911': 021895
'912': 021995
'913': 021996
'914': 021997
'915': 021998
'916': 021999
'917': '022000'
'918': '022001'
'919': 022088
'920': 022091
'921': 022093
'922': 022094
'923': 022095
'924': 022097
'925': '022150'
'926': 022295
'927': 022296
'928': '022315'
'929': 022348
'930': '022472'
'931': '022473'
'932': '022474'
'933': '022475'
'934': '022476'
'935': '022477'
'936': 022478
'937': 022479
'938': 022480
'939': 022481
'940': '023010'
'941': '023013'
'942': '023014'
'943': '023015'
'944': '023016'
'945': '023037'
'946': 023039
'947': '023041'
'948': '023063'
'949': '023155'
'950': '023156'
'951': '023172'
'952': 023329
'953': '023353'
'954': '023355'
'955': '023371'
'956': '023372'
'957': '023505'
'958': 023862
'959': '024216'
'960': '024217'
'961': 024218
'962': '024362'
'963': '024363'
'964': '024364'
'965': '024365'
'966': '024366'
'967': '024367'
'968': 024368
'969': 024369
'970': '024370'
'971': '024371'
'972': 024418
'973': '024420'
'974': '024421'
'975': '024422'
'976': '024423'
'977': '024424'
'978': '024425'
'979': '024426'
'980': '024427'
'981': 024428
'982': 024429
'983': '024430'
'984': '024431'
'985': '024432'
'986': '024512'
'987': '024515'
'988': '024521'
'989': '024524'
'990': 024698
'991': 024699
'992': '024700'
'993': '024701'
'994': '024702'
'995': '024717'
'996': '024720'
'997': 024739
'998': '024741'
'999': '024742'
'1000': '024745'
'1001': '024746'
'1002': '024747'
'1003': 024748
'1004': 024749
'1005': 024842
'1006': 024898
'1007': 024899
'1008': 024901
'1009': 024912
'1010': 024915
'1011': 024917
'1012': 024963
'1013': 024975
'1014': 024983
'1015': 025028
'1016': 025029
'1017': '025030'
'1018': '025031'
'1019': '025032'
'1020': '025033'
'1021': '025055'
'1022': '025063'
'1023': '025066'
'1024': '025104'
'1025': '025124'
'1026': '025215'
'1027': '025216'
'1028': '025227'
'1029': '025232'
'1030': '025233'
'1031': '025234'
'1032': '025235'
'1033': '025324'
'1034': 025378
'1035': '025601'
'1036': '025603'
'1037': '025605'
'1038': '025606'
'1039': 025608
'1040': 025609
'1041': 025668
'1042': 025669
'1043': '025670'
'1044': 025795
'1045': 025796
'1046': 025797
'1047': 025802
'1048': 025804
'1049': '026007'
'1050': 026008
'1051': '026010'
'1052': '026011'
'1053': '026012'
'1054': '026013'
'1055': '026014'
'1056': '026016'
'1057': '026017'
'1058': '026020'
'1059': '026021'
'1060': '026022'
'1061': '026025'
'1062': '026026'
'1063': '026034'
'1064': '026035'
'1065': '026036'
'1066': 026169
'1067': '026174'
'1068': 026298
'1069': '026301'
'1070': '026302'
'1071': '026307'
'1072': '026322'
'1073': '026464'
'1074': '026465'
'1075': '026466'
'1076': 026583
'1077': '026600'
'1078': '026605'
'1079': 026629
'1080': 026638
'1081': 026639
'1082': '026640'
'1083': '026641'
'1084': '026642'
'1085': '026643'
'1086': '026651'
'1087': '026652'
'1088': '026653'
'1089': '026654'
'1090': '026655'
'1091': '026656'
'1092': '026657'
'1093': 026658
'1094': 026659
'1095': '026674'
'1096': 026681
'1097': '026754'
'1098': '026765'
'1099': 026859
'1100': 026861
'1101': 026902
'1102': 026904
'1103': 026905
'1104': 026906
'1105': '027164'
'1106': '027177'
'1107': 027194
'1108': 027195
'1109': 027197
'1110': 027198
'1111': 027258
'1112': '027406'
'1113': '027454'
'1114': '027455'
'1115': '027456'
'1116': '027547'
'1117': 027548
'1118': 027549
'1119': '027550'
'1120': '027551'
'1121': '027552'
'1122': 027609
'1123': '027610'
'1124': '027611'
'1125': '027612'
'1126': '027613'
'1127': '027667'
'1128': '027673'
'1129': 027797
'1130': 027798
'1131': 027799
'1132': 027802
'1133': 027803
'1134': 027804
'1135': 027805
'1136': 027855
'1137': 027856
'1138': 027866
'1139': 027945
'1140': 027953
'1141': 027975
'1142': 027978
'1143': 027981
'1144': 027987
'1145': 028070
'1146': 028072
'1147': 028179
'1148': 028241
'1149': 028260
'1150': 028266
'1151': 028274
'1152': 028375
'1153': 028376
'1154': 028477
'1155': 028478
'1156': 028479
'1157': 028480
'1158': 028481
'1159': 028482
'1160': 028483
'1161': 028484
'1162': 028485
'1163': 028546
'1164': 028548
'1165': 028553
'1166': 028571
'1167': 028608
'1168': 028692
'1169': 028802
'1170': 029037
'1171': 029039
'1172': 029040
'1173': 029041
'1174': 029042
'1175': 029043
'1176': 029044
'1177': 029045
'1178': 029128
'1179': 029180
'1180': 029243
'1181': 029245
'1182': 029255
'1183': 029271
'1184': 029272
'1185': 029350
'1186': 029351
'1187': 029355
'1188': 029465
'1189': 029480
'1190': 029526
'1191': 029528
'1192': 029530
'1193': 029587
'1194': 029602
'1195': 029673
'1196': 029718
'1197': 029719
'1198': 029720
'1199': 029721
'1200': 029738
'1201': 029739
'1202': 029740
'1203': 029741
'1204': 029742
'1205': 029744
'1206': 029745
'1207': 029746
'1208': 029747
'1209': 029750
'1210': 029752
'1211': 029807
'1212': 029813
'1213': 029816
'1214': 029961
'1215': 029971
'1216': '030041'
'1217': '030043'
'1218': '030050'
'1219': '030056'
'1220': 030058
'1221': 030059
'1222': 030090
'1223': 030095
'1224': '030120'
'1225': 030196
'1226': 030198
'1227': '030230'
'1228': '030316'
'1229': 030486
'1230': 030487
'1231': 030488
'1232': 030519
'1233': '030520'
'1234': '030521'
'1235': '030522'
'1236': '030636'
'1237': 030682
'1238': 030690
'1239': '030702'
'1240': '030740'
'1241': 030895
'1242': '031040'
'1243': '031041'
'1244': '031042'
'1245': '031043'
'1246': '031044'
'1247': '031165'
'1248': '031356'
'1249': 031389
'1250': 031390
'1251': 031391
'1252': 031392
'1253': 031568
'1254': 031807
'1255': 031887
'1256': 031888
'1257': 031889
'1258': 031999
'1259': '032001'
'1260': '032021'
'1261': '032075'
'1262': 032081
'1263': 032218
'1264': '032325'
'1265': '032326'
'1266': '032327'
'1267': 032328
'1268': 032329
'1269': '032330'
'1270': '032331'
'1271': '032332'
'1272': '032333'
'1273': '032334'
'1274': '032335'
'1275': '032336'
'1276': '032337'
'1277': 032338
'1278': 032339
'1279': '032340'
'1280': '032433'
'1281': '032435'
'1282': '032437'
'1283': 032438
'1284': 032439
'1285': '032525'
'1286': 032686
'1287': 032687
'1288': 032689
'1289': 032693
'1290': 032694
'1291': 032695
'1292': '032755'
'1293': '032756'
'1294': 032759
'1295': '032760'
'1296': 032800
'1297': 032882
'1298': '033020'
'1299': 033049
'1300': '033050'
'1301': '033064'
'1302': '033067'
'1303': 033068
'1304': 033069
'1305': '033070'
'1306': '033071'
'1307': '033072'
'1308': '033123'
'1309': '033124'
'1310': '033203'
'1311': '033216'
'1312': '033221'
'1313': 033278
'1314': '033415'
'1315': '033422'
'1316': '033424'
'1317': '033426'
'1318': '033446'
'1319': 033459
'1320': '033460'
'1321': '033461'
'1322': '033465'
'1323': '033477'
'1324': 033486
'1325': 033538
'1326': 033992
'1327': '034003'
'1328': '034147'
'1329': '034167'
'1330': '034257'
'1331': 034258
'1332': '034263'
'1333': 034484
'1334': '034510'
'1335': '034511'
'1336': 034994
'1337': 034996
'1338': '035007'
'1339': 035008
'1340': 035182
'1341': 035184
'1342': 035198
'1343': 035199
'1344': '035204'
'1345': 035296
'1346': 035299
'1347': '035443'
'1348': '035444'
'1349': '035462'
'1350': '035527'
'1351': '035534'
'1352': '035535'
'1353': '035537'
'1354': 035539
'1355': '035541'
'1356': '035543'
'1357': '035544'
'1358': '035545'
'1359': 035549
'1360': '035550'
'1361': 035569
'1362': '035571'
'1363': 035608
'1364': '035734'
'1365': 036096
'1366': 036097
'1367': 036099
'1368': '036143'
'1369': '036144'
'1370': '036145'
'1371': '036146'
'1372': '036147'
'1373': '036245'
'1374': '036257'
'1375': 036258
'1376': '036261'
'1377': '036272'
'1378': '036273'
'1379': '036275'
'1380': '036277'
'1381': '036302'
'1382': '036304'
'1383': '036322'
'1384': '036333'
'1385': '036371'
'1386': 036380
'1387': 036388
'1388': 036428
'1389': '036435'
'1390': 036481
'1391': '036526'
'1392': '036560'
'1393': '036567'
'1394': '036614'
'1395': '036615'
'1396': '036616'
'1397': 036618
'1398': '036643'
'1399': 036659
'1400': 036799
'1401': 036959
'1402': 036961
'1403': 036965
'1404': 036966
'1405': 036983
'1406': 036984
'1407': 036985
'1408': 036986
'1409': 036987
'1410': 036988
'1411': 036990
'1412': 036992
'1413': 036994
'1414': 036997
'1415': 036998
'1416': 036999
'1417': '037041'
'1418': '037111'
'1419': '037113'
'1420': 037119
'1421': '037121'
'1422': '037131'
'1423': '037136'
'1424': '037141'
'1425': '037147'
'1426': '037324'
'1427': '037325'
'1428': 037368
'1429': 037369
'1430': '037416'
'1431': '037417'
'1432': '037423'
'1433': 037538
'1434': 037592
'1435': '037725'
'1436': '037727'
'1437': '037730'
'1438': '037731'
'1439': 037779
'1440': 037781
'1441': 037784
'1442': 037859
'1443': 037911
'1444': 037920
'1445': 038312
'1446': 038321
'1447': 038323
'1448': 038326
'1449': 038351
'1450': 038352
'1451': 038353
'1452': 038354
'1453': 038361
'1454': 038362
'1455': 038363
'1456': 038365
'1457': 038399
'1458': 038435
'1459': 038450
'1460': 038522
'1461': 038557
'1462': 038560
'1463': 038775
'1464': 038776
'1465': 038777
'1466': 038778
'1467': 038779
'1468': 038780
'1469': 038781
'1470': 038782
'1471': 038783
'1472': 038784
'1473': 038785
'1474': 038817
'1475': 038818
'1476': 038819
'1477': 038820
'1478': 038821
'1479': 038822
'1480': 038823
'1481': 038824
'1482': 038825
'1483': 038826
'1484': 038827
'1485': 038828
'1486': 038829
'1487': 038830
'1488': 038833
'1489': 038834
'1490': 038847
'1491': 038859
'1492': 038878
'1493': 038879
'1494': 038880
'1495': 038881
'1496': 038882
'1497': 038884
'1498': 038886
'1499': 038887
'1500': 038888
'1501': 038890
'1502': 038891
'1503': 038892
'1504': 038893
'1505': 038894
'1506': 038895
'1507': 038896
'1508': 038898
'1509': 038899
'1510': 038900
'1511': 038901
'1512': 038902
'1513': 038904
'1514': 038905
'1515': 038906
'1516': 038907
'1517': 038908
'1518': 038910
'1519': 038911
'1520': 038912
'1521': 038914
'1522': 038955
'1523': 038961
'1524': 038964
'1525': 038965
'1526': 038966
'1527': 038967
'1528': 039188
'1529': 039259
'1530': 039278
'1531': 039291
'1532': 039298
'1533': 039316
'1534': 039317
'1535': 039318
'1536': 039357
'1537': 039359
'1538': 039378
'1539': 039484
'1540': 039488
'1541': 039530
'1542': 039605
'1543': 039607
'1544': 039658
'1545': 039659
'1546': 039660
'1547': 039661
'1548': 039662
'1549': 039663
'1550': 039664
'1551': 039665
'1552': 039666
'1553': 039667
'1554': 039875
'1555': 039900
'1556': 039904
'1557': '040121'
'1558': '040122'
'1559': '040123'
'1560': '040133'
'1561': '040134'
'1562': 040139
'1563': '040141'
'1564': '040147'
'1565': '040161'
'1566': 040180
'1567': 040182
'1568': 040229
'1569': '040230'
'1570': '040231'
'1571': '040232'
'1572': '040233'
'1573': '040234'
'1574': '040235'
'1575': '040236'
'1576': '040237'
'1577': 040238
'1578': 040239
'1579': '040240'
'1580': '040241'
'1581': '040242'
'1582': '040243'
'1583': '040244'
'1584': '040245'
'1585': '040250'
'1586': 040509
'1587': '040525'
'1588': '040541'
'1589': '040542'
'1590': 040598
'1591': '040654'
'1592': '040655'
'1593': '040656'
'1594': '040657'
'1595': 040658
'1596': 040659
'1597': '040660'
'1598': 040683
'1599': '040725'
'1600': 040842
'1601': 040843
'1602': 040844
'1603': 040845
'1604': 040851
'1605': 040903
'1606': 040908
'1607': 040909
'1608': 040938
'1609': 040940
'1610': 040984
'1611': 040985
'1612': 040986
'1613': 041018
'1614': 041019
'1615': '041020'
'1616': '041054'
'1617': 041095
'1618': '041147'
'1619': 041191
'1620': 041192
'1621': '041310'
'1622': 041381
'1623': 041568
'1624': '041570'
'1625': '041573'
'1626': '041605'
'1627': 041709
'1628': '041714'
'1629': 041812
'1630': 041819
'1631': 041820
'1632': 041825
'1633': 041961
'1634': 041962
'1635': 041965
'1636': 041971
'1637': 041983
'1638': '042014'
'1639': '042016'
'1640': '042017'
'1641': 042018
'1642': 042019
'1643': '042020'
'1644': '042023'
'1645': '042025'
'1646': 042029
'1647': '042030'
'1648': '042031'
'1649': '042040'
'1650': '042044'
'1651': '042045'
'1652': '042046'
'1653': 042048
'1654': 042119
'1655': '042126'
'1656': 042129
'1657': '042135'
'1658': 042138
'1659': 042139
'1660': '042141'
'1661': '042146'
'1662': '042234'
'1663': '042235'
'1664': '042236'
'1665': 042238
'1666': '042240'
'1667': '042241'
'1668': '042243'
'1669': '042245'
'1670': '042247'
'1671': '042310'
'1672': '042372'
'1673': '042373'
'1674': '042374'
'1675': '042375'
'1676': '042376'
'1677': '042377'
'1678': '042442'
'1679': '042463'
'1680': '042475'
'1681': 042648
'1682': 042659
'1683': '042751'
'1684': '042761'
'1685': 042789
'1686': 042844
'1687': 042851
'1688': 042911
'1689': 042914
'1690': 042915
'1691': 042966
'1692': 042984
'1693': '043016'
'1694': 043018
'1695': 043019
'1696': '043020'
'1697': '043021'
'1698': '043022'
'1699': '043023'
'1700': '043024'
'1701': '043025'
'1702': '043026'
'1703': '043027'
'1704': 043028
'1705': 043029
'1706': '043030'
'1707': '043063'
'1708': '043172'
'1709': '043173'
'1710': '043516'
'1711': '043517'
'1712': 043518
'1713': 043519
'1714': '043520'
'1715': '043521'
'1716': '043533'
'1717': '043534'
'1718': '043535'
'1719': '043536'
'1720': 043585
'1721': 043586
'1722': 043587
'1723': 043588
'1724': 043589
'1725': 043590
'1726': 043592
'1727': 043593
'1728': 043594
'1729': 043595
'1730': 043596
'1731': 043598
'1732': 043599
'1733': '043600'
'1734': 043608
'1735': '043621'
'1736': '043623'
'1737': 043691
'1738': 043695
'1739': 043696
'1740': 043697
'1741': 043698
'1742': 043699
'1743': '043761'
'1744': '043765'
'1745': '043766'
'1746': '043767'
'1747': 043768
'1748': '043773'
'1749': 043796
'1750': 043842
'1751': 043843
'1752': 043844
'1753': 043856
'1754': 043857
'1755': 043858
'1756': 043859
'1757': 043860
'1758': 043861
'1759': 043863
'1760': 043865
'1761': 043866
'1762': 043867
'1763': 043868
'1764': 043869
'1765': 043883
'1766': 043886
'1767': 043899
'1768': 043911
'1769': 043962
'1770': 043965
'1771': 044092
'1772': '044110'
'1773': 044169
'1774': '044236'
'1775': '044342'
'1776': '044347'
'1777': '044354'
'1778': '044355'
'1779': '044777'
'1780': 044778
'1781': 044779
'1782': 044780
'1783': 044781
'1784': 044782
'1785': 044791
'1786': 044792
'1787': 044793
'1788': 044794
'1789': 044795
'1790': 044796
'1791': 044797
'1792': 044798
'1793': 044799
'1794': 044800
'1795': 044801
'1796': 044802
'1797': 044803
'1798': 044804
'1799': 044805
'1800': 044806
'1801': 044809
'1802': 044820
'1803': 044821
'1804': 044822
'1805': 044823
'1806': 044848
'1807': 044849
'1808': 044850
'1809': 044851
'1810': 044853
'1811': 044854
'1812': 044917
'1813': 044918
'1814': 044946
'1815': 044947
'1816': 044948
'1817': 044949
'1818': 044950
'1819': 044951
'1820': 044952
'1821': '045055'
'1822': 045099
'1823': '045100'
'1824': '045101'
'1825': '045102'
'1826': '045103'
'1827': 045119
'1828': '045122'
'1829': '045125'
'1830': '045126'
'1831': '045127'
'1832': 045128
'1833': 045149
'1834': '045150'
'1835': '045151'
'1836': '045152'
'1837': '045153'
'1838': '045154'
'1839': '045335'
'1840': 045387
'1841': 045388
'1842': 045389
'1843': 045390
'1844': 045391
'1845': 045392
'1846': 045393
'1847': '045474'
'1848': '045475'
'1849': 045508
'1850': '045513'
'1851': '045514'
'1852': '045515'
'1853': '045516'
'1854': '045517'
'1855': 045518
'1856': 045519
'1857': '045520'
'1858': '045521'
'1859': '045522'
'1860': '045523'
'1861': 045934
'1862': 045941
'1863': '046024'
'1864': '046043'
'1865': 046058
'1866': 046068
'1867': 046078
'1868': 046079
'1869': '046157'
'1870': 046158
'1871': 046159
'1872': '046160'
'1873': '046161'
'1874': '046162'
'1875': 046238
'1876': '046241'
'1877': '046525'
'1878': '046611'
'1879': '046711'
'1880': '046717'
'1881': 046718
'1882': '046720'
'1883': '046726'
'1884': '046732'
'1885': '046733'
'1886': '046736'
'1887': 046839
'1888': 046840
'1889': 046841
'1890': 046842
'1891': 046844
'1892': 046846
'1893': 046854
'1894': 046855
'1895': 046928
'1896': 046930
'1897': '047032'
'1898': 047068
'1899': 047069
'1900': '047070'
'1901': '047071'
'1902': '047072'
'1903': '047073'
'1904': '047074'
'1905': '047075'
'1906': '047076'
'1907': '047077'
'1908': '047100'
'1909': 047192
'1910': 047193
'1911': 047194
'1912': 047195
'1913': 047196
'1914': 047197
'1915': 047198
'1916': 047199
'1917': '047200'
'1918': '047201'
'1919': '047202'
'1920': '047260'
'1921': '047471'
'1922': '047506'
'1923': '047510'
'1924': '047526'
'1925': 047628
'1926': '047657'
'1927': 047658
'1928': 047659
'1929': '047660'
'1930': '047661'
'1931': '047662'
'1932': '047663'
'1933': '047665'
'1934': '047666'
'1935': '047670'
'1936': '047671'
'1937': '047707'
'1938': 047826
'1939': 047835
'1940': 047865
'1941': 047868
'1942': 047894
'1943': 047895
'1944': 047896
'1945': 047897
'1946': 047916
'1947': 047921
'1948': 048015
'1949': 048042
'1950': 048043
'1951': 048044
'1952': 048046
'1953': 048269
'1954': 048293
'1955': 048307
'1956': 048317
'1957': 048367
'1958': 048368
'1959': 048369
'1960': 048437
'1961': 048439
'1962': 048440
'1963': 048442
'1964': 048443
'1965': 048444
'1966': 048446
'1967': 048450
'1968': 048452
'1969': 048453
'1970': 048454
'1971': 048456
'1972': 048457
'1973': 048462
'1974': 048463
'1975': 048464
'1976': 048465
'1977': 048466
'1978': 048488
'1979': 048489
'1980': 048491
'1981': 048492
'1982': 048493
'1983': 048494
'1984': 048763
'1985': 048808
'1986': 048815
'1987': 048861
'1988': 048862
'1989': 048863
'1990': 048864
'1991': 048865
'1992': 048931
'1993': 048990
'1994': 048999
'1995': 049029
'1996': 049030
'1997': 049039
'1998': 049061
'1999': 049062
'2000': 049064
'2001': 049066
'2002': 049067
'2003': 049068
'2004': 049070
'2005': 049071
'2006': 049072
'2007': 049073
'2008': 049394
'2009': 049401
'2010': 049407
'2011': 049408
'2012': 049441
'2013': 049473
'2014': 049476
'2015': 049477
'2016': 049478
'2017': 049479
'2018': 049812
'2019': 049817
'2020': 049842
'2021': 049843
'2022': 049844
'2023': 049845
'2024': 049846
'2025': 049847
'2026': 049848
'2027': 049849
'2028': 049856
'2029': 049857
'2030': '050264'
'2031': '050272'
'2032': '050276'
'2033': 050283
'2034': '050323'
'2035': '050444'
'2036': '050445'
'2037': '050446'
'2038': '050447'
'2039': 050448
'2040': 050449
'2041': 050539
'2042': '050543'
'2043': '050752'
'2044': '050753'
'2045': '050754'
'2046': 050836
'2047': 050952
'2048': 050955
'2049': 050956
'2050': '051004'
'2051': '051005'
'2052': '051006'
'2053': '051111'
'2054': '051112'
'2055': '051113'
'2056': '051114'
'2057': '051115'
'2058': '051117'
'2059': 051118
'2060': '051120'
'2061': '051157'
'2062': 051158
'2063': '051203'
'2064': '051260'
'2065': '051261'
'2066': '051262'
'2067': '051263'
'2068': '051265'
'2069': '051267'
'2070': 051268
'2071': 051269
'2072': '051271'
'2073': '051272'
'2074': '051273'
'2075': '051274'
'2076': '051275'
'2077': '051276'
'2078': 051278
'2079': 051291
'2080': 051292
'2081': '051301'
'2082': '051305'
'2083': '051333'
'2084': 051479
'2085': '051655'
'2086': 051659
'2087': '051661'
'2088': '051776'
'2089': 051784
'2090': 051785
'2091': 051918
'2092': 051919
'2093': 051923
'2094': 051954
'2095': 051991
'2096': 051992
'2097': 051998
'2098': 051999
'2099': '052000'
'2100': '052001'
'2101': '052034'
'2102': '052035'
'2103': '052036'
'2104': '052037'
'2105': 052039
'2106': '052040'
'2107': '052041'
'2108': '052042'
'2109': '052044'
'2110': '052045'
'2111': 052118
'2112': 052119
'2113': '052120'
'2114': '052121'
'2115': '052122'
'2116': '052123'
'2117': '052124'
'2118': '052125'
'2119': '052126'
'2120': '052127'
'2121': 052128
'2122': 052129
'2123': '052141'
'2124': '052375'
'2125': 052380
'2126': 052389
'2127': 052393
'2128': 052409
'2129': '052446'
'2130': '052447'
'2131': 052448
'2132': 052449
'2133': '052451'
'2134': '052452'
'2135': '052500'
'2136': '052501'
'2137': '052502'
'2138': 052508
'2139': '052522'
'2140': 052579
'2141': 052628
'2142': 052629
'2143': '052630'
'2144': '052631'
'2145': '052632'
'2146': '052633'
'2147': '052634'
'2148': '052635'
'2149': '052636'
'2150': '052637'
'2151': 052638
'2152': 052639
'2153': '052641'
'2154': '052642'
'2155': '052644'
'2156': '052645'
'2157': '052646'
'2158': '052647'
'2159': 052648
'2160': 052649
'2161': '052650'
'2162': 052859
'2163': 052860
'2164': 052861
'2165': 052862
'2166': 052945
'2167': 052946
'2168': 052947
'2169': 052948
'2170': 052950
'2171': 052951
'2172': 052953
'2173': 052954
'2174': 052955
'2175': '053152'
'2176': '053154'
'2177': '053156'
'2178': '053157'
'2179': 053158
'2180': 053159
'2181': '053160'
'2182': 053228
'2183': 053229
'2184': 053299
'2185': '053300'
'2186': '053301'
'2187': '053302'
'2188': 053379
'2189': 053381
'2190': '053457'
'2191': 053496
'2192': '053576'
'2193': 053578
'2194': 053586
'2195': 053587
'2196': 053588
'2197': 053589
'2198': 053591
'2199': 053592
'2200': '053675'
'2201': '053723'
'2202': '053724'
'2203': '053725'
'2204': '053726'
'2205': '053727'
'2206': 053728
'2207': 053729
'2208': 053807
'2209': 053862
'2210': 053863
'2211': 053937
'2212': 054019
'2213': '054031'
'2214': '054032'
'2215': '054033'
'2216': '054034'
'2217': '054037'
'2218': 054039
'2219': '054061'
'2220': '054062'
'2221': '054063'
'2222': '054064'
'2223': 054149
'2224': '054150'
'2225': '054151'
'2226': '054152'
'2227': '054153'
'2228': '054154'
'2229': '054155'
'2230': '054156'
'2231': 054158
'2232': 054159
'2233': '054160'
'2234': '054163'
'2235': '054234'
'2236': '054235'
'2237': '054236'
'2238': '054237'
'2239': 054297
'2240': '054335'
'2241': '054365'
'2242': '054376'
'2243': '054433'
'2244': '054436'
'2245': '054437'
'2246': 054438
'2247': '054442'
'2248': '054443'
'2249': '054463'
'2250': '054464'
'2251': '054465'
'2252': '054466'
'2253': '054467'
'2254': 054468
'2255': 054469
'2256': '054470'
'2257': '054475'
'2258': '054476'
'2259': 054479
'2260': 054480
'2261': 054481
'2262': 054482
'2263': 054496
'2264': '054554'
'2265': 054568
'2266': '054570'
'2267': '054576'
'2268': 054578
'2269': 054580
'2270': '054621'
'2271': '054623'
'2272': '054624'
'2273': '054625'
'2274': '054626'
'2275': '054662'
'2276': '054664'
'2277': '054665'
'2278': '054666'
'2279': '054667'
'2280': '054703'
'2281': 054719
'2282': '054735'
'2283': '054753'
'2284': 054874
'2285': 054942
'2286': '055076'
'2287': 055097
'2288': '055100'
'2289': '055101'
'2290': '055102'
'2291': '055113'
'2292': 055119
'2293': '055120'
'2294': '055121'
'2295': '055122'
'2296': '055123'
'2297': '055124'
'2298': 055149
'2299': 055183
'2300': 055186
'2301': '055231'
'2302': '055232'
'2303': '055233'
'2304': '055234'
'2305': '055235'
'2306': '055236'
'2307': '055237'
'2308': 055238
'2309': '055240'
'2310': '055241'
'2311': '055242'
'2312': 055285
'2313': 055286
'2314': 055287
'2315': 055288
'2316': 055289
'2317': 055290
'2318': 055291
'2319': 055292
'2320': 055293
'2321': 055294
'2322': 055295
'2323': '055402'
'2324': '055430'
'2325': '055436'
'2326': '055437'
'2327': 055480
'2328': 055481
'2329': 055549
'2330': '055572'
'2331': 055709
'2332': '055710'
'2333': '055711'
'2334': '055712'
'2335': '055713'
'2336': '055714'
'2337': '055715'
'2338': '055716'
'2339': '055717'
'2340': 055718
'2341': 055719
'2342': 055782
'2343': 055783
'2344': 055786
'2345': 055807
'2346': 055808
'2347': 055809
'2348': 055810
'2349': 055811
'2350': 055826
'2351': 055827
'2352': 055828
'2353': 055830
'2354': 055831
'2355': 055832
'2356': 055833
'2357': 055900
'2358': '056010'
'2359': '056015'
'2360': '056020'
'2361': 056028
'2362': 056029
'2363': '056030'
'2364': '056031'
'2365': '056033'
'2366': '056034'
'2367': '056036'
'2368': '056247'
'2369': 056248
'2370': 056249
'2371': '056273'
'2372': '056274'
'2373': '056275'
'2374': '056460'
'2375': '056465'
'2376': '056466'
'2377': '056467'
'2378': 056468
'2379': 056469
'2380': '056470'
'2381': '056471'
'2382': '056472'
'2383': '056474'
'2384': 056493
'2385': 056495
'2386': 056496
'2387': 056497
'2388': 056498
'2389': 056499
'2390': '056516'
'2391': '056517'
'2392': 056518
'2393': 056519
'2394': '056520'
'2395': '056521'
'2396': '056523'
'2397': '056552'
'2398': 056559
'2399': 056639
'2400': '056640'
'2401': '056641'
'2402': '056645'
'2403': '056646'
'2404': 056648
'2405': 056649
'2406': '056650'
'2407': '056651'
'2408': 056686
'2409': 056687
'2410': 056688
'2411': 056689
'2412': 056690
'2413': 056691
'2414': 056692
'2415': 056693
'2416': 056694
'2417': 056695
'2418': 056696
'2419': 056795
'2420': 056796
'2421': 056797
'2422': 056798
'2423': 056799
'2424': 056800
'2425': 056801
'2426': 056802
'2427': 056803
'2428': 056804
'2429': 056805
'2430': 056874
'2431': 056888
'2432': 056895
'2433': 056929
'2434': 057078
'2435': '057164'
'2436': '057175'
'2437': '057176'
'2438': '057177'
'2439': 057178
'2440': 057179
'2441': 057180
'2442': '057271'
'2443': '057272'
'2444': '057273'
'2445': '057274'
'2446': '057344'
'2447': '057360'
'2448': '057371'
'2449': '057417'
'2450': 057418
'2451': '057435'
'2452': '057437'
'2453': 057439
'2454': '057440'
'2455': '057442'
'2456': '057500'
'2457': '057540'
'2458': 057569
'2459': '057626'
'2460': '057627'
'2461': 057628
'2462': 057629
'2463': '057630'
'2464': 057639
'2465': '057640'
'2466': 057648
'2467': 057658
'2468': '057661'
'2469': '057662'
'2470': '057663'
'2471': '057665'
'2472': 057691
'2473': 057697
'2474': 057819
'2475': 057820
'2476': 057821
'2477': 057822
'2478': 057823
'2479': 057891
'2480': 057892
'2481': 057936
'2482': 057937
'2483': 057938
'2484': 057939
'2485': 057943
'2486': 057968
'2487': 058052
'2488': 058053
'2489': 058054
'2490': 058060
'2491': 058061
'2492': 058063
'2493': 058068
'2494': 058070
'2495': 058115
'2496': 058116
'2497': 058117
'2498': 058135
'2499': 058140
'2500': 058161
'2501': 058162
'2502': 058164
'2503': 058166
'2504': 058169
'2505': 058170
'2506': 058173
'2507': 058174
'2508': 058207
'2509': 058212
'2510': 058213
'2511': 058215
'2512': 058221
'2513': 058225
'2514': 058333
'2515': 058334
'2516': 058341
'2517': 058474
'2518': 058539
'2519': 058540
'2520': 058541
'2521': 058542
'2522': 058543
'2523': 059078
'2524': 059373
'2525': 059374
'2526': 059443
'2527': 059445
'2528': 059446
'2529': 059448
'2530': 059449
'2531': 059451
'2532': 059454
'2533': 059561
'2534': 059562
'2535': 059581
'2536': 059653
'2537': 059654
'2538': 059656
'2539': 059657
'2540': 059658
'2541': 059659
'2542': 059660
'2543': 059663
'2544': 059664
'2545': 059666
'2546': 059667
'2547': 059669
'2548': 059671
'2549': 059673
'2550': 059675
'2551': 059676
'2552': 059677
'2553': 059678
'2554': 059679
'2555': 059680
'2556': 059681
'2557': 059682
'2558': 059683
'2559': 059684
'2560': 059685
'2561': 059686
'2562': 059687
'2563': 059688
'2564': 059695
'2565': 059702
'2566': 059706
'2567': 059707
'2568': 059708
'2569': 059709
'2570': 059710
'2571': 059711
'2572': 059718
'2573': 059719
'2574': 059720
'2575': 059721
'2576': 059723
'2577': 059724
'2578': 059725
'2579': 059726
'2580': 059727
'2581': 059823
'2582': 059876
'2583': 059930
'2584': '060037'
'2585': 060038
'2586': '060041'
'2587': '060042'
'2588': '060045'
'2589': 060048
'2590': '060074'
'2591': '060143'
'2592': '060144'
'2593': '060145'
'2594': '060146'
'2595': '060170'
'2596': '060317'
'2597': '060331'
'2598': '060472'
'2599': '060474'
'2600': '060476'
'2601': '060477'
'2602': 060478
'2603': '060510'
'2604': '060533'
'2605': '060534'
'2606': '060535'
'2607': '060536'
'2608': '060537'
'2609': '060544'
'2610': '060547'
'2611': 060548
'2612': 060549
'2613': '060736'
'2614': '060753'
'2615': '060754'
'2616': '060755'
'2617': '060756'
'2618': '060757'
'2619': 060758
'2620': '060775'
'2621': '060776'
'2622': '060777'
'2623': 060857
'2624': 060864
'2625': 060865
'2626': 060871
'2627': 060872
'2628': 060873
'2629': 060874
'2630': 060875
'2631': 060994
'2632': '061006'
'2633': '061007'
'2634': 061008
'2635': '061010'
'2636': '061011'
'2637': '061012'
'2638': '061013'
'2639': 061159
'2640': '061160'
'2641': '061161'
'2642': '061172'
'2643': '061174'
'2644': '061175'
'2645': '061452'
'2646': '061453'
'2647': 061491
'2648': 061492
'2649': 061493
'2650': 061587
'2651': 061589
'2652': 061591
'2653': 061592
'2654': 061668
'2655': '061670'
'2656': 061679
'2657': '061734'
'2658': '061736'
'2659': '061742'
'2660': 061814
'2661': 061820
'2662': 061821
'2663': 061884
'2664': '062001'
'2665': '062003'
'2666': '062005'
'2667': '062007'
'2668': '062163'
'2669': '062164'
'2670': '062165'
'2671': 062180
'2672': 062183
'2673': 062184
'2674': 062185
'2675': 062186
'2676': 062187
'2677': 062188
'2678': 062189
'2679': 062190
'2680': 062191
'2681': 062192
'2682': 062193
'2683': 062194
'2684': 062195
'2685': 062196
'2686': '062337'
'2687': '062426'
'2688': '062436'
'2689': '062445'
'2690': '062446'
'2691': 062448
'2692': 062449
'2693': '062450'
'2694': '062452'
'2695': 062458
'2696': '062525'
'2697': '062526'
'2698': '062527'
'2699': 062528
'2700': 062529
'2701': '062531'
'2702': '062532'
'2703': '062533'
'2704': '062534'
'2705': 062586
'2706': 062589
'2707': 062591
'2708': 062592
'2709': 062594
'2710': 062595
'2711': 062596
'2712': '062655'
'2713': '062671'
'2714': '062742'
'2715': 062748
'2716': 062749
'2717': '062750'
'2718': '062751'
'2719': '062753'
'2720': '063043'
'2721': '063044'
'2722': '063045'
'2723': '063064'
'2724': '063065'
'2725': '063117'
'2726': 063149
'2727': 063159
'2728': '063161'
'2729': 063191
'2730': 063208
'2731': '063224'
'2732': '063226'
'2733': '063250'
'2734': '063251'
'2735': '063252'
'2736': '063253'
'2737': '063255'
'2738': '063257'
'2739': 063258
'2740': 063287
'2741': 063289
'2742': 063290
'2743': 063291
'2744': 063292
'2745': '063456'
'2746': '063457'
'2747': '063470'
'2748': '063471'
'2749': '063472'
'2750': '063626'
'2751': '063655'
'2752': '063733'
'2753': '063747'
'2754': '063755'
'2755': '063757'
'2756': '063770'
'2757': 063789
'2758': 063803
'2759': 063804
'2760': 063805
'2761': 063874
'2762': 063900
'2763': 063908
'2764': 063922
'2765': 063936
'2766': 063999
'2767': '064005'
'2768': '064006'
'2769': '064007'
'2770': 064008
'2771': 064009
'2772': '064035'
'2773': 064078
'2774': 064079
'2775': 064091
'2776': 064093
'2777': '064247'
'2778': 064248
'2779': 064249
'2780': '064252'
'2781': '064253'
'2782': '064331'
'2783': '064332'
'2784': '064333'
'2785': '064334'
'2786': 064338
'2787': '064364'
'2788': '064365'
'2789': '064366'
'2790': '064407'
'2791': 064408
'2792': 064409
'2793': '064410'
'2794': '064515'
'2795': '064516'
'2796': '064517'
'2797': 064519
'2798': '064520'
'2799': '064521'
'2800': '064522'
'2801': '064523'
'2802': '064535'
'2803': '064536'
'2804': '064537'
'2805': 064538
'2806': '064542'
'2807': '064553'
'2808': '064556'
'2809': '064567'
'2810': 064590
'2811': 064591
'2812': 064592
'2813': 064593
'2814': 064594
'2815': '064601'
'2816': '064604'
'2817': 064618
'2818': '064625'
'2819': '064626'
'2820': '064627'
'2821': 064628
'2822': 064629
'2823': '064630'
'2824': '064631'
'2825': 064659
'2826': 064787
'2827': 064788
'2828': 064789
'2829': 064796
'2830': 064809
'2831': 064834
'2832': 064840
'2833': 064841
'2834': 064854
'2835': 064855
'2836': 064856
'2837': 064857
'2838': 064858
'2839': 064859
'2840': 064860
'2841': 064861
'2842': 064862
'2843': 064863
'2844': 064864
'2845': 064865
'2846': 064866
'2847': 064893
'2848': 064895
'2849': 064896
'2850': 064918
'2851': 064919
'2852': 064988
'2853': 064989
'2854': 064990
'2855': 064991
'2856': 064992
'2857': 064993
'2858': 064994
'2859': 064995
'2860': '065037'
'2861': 065038
'2862': 065039
'2863': '065040'
'2864': '065063'
'2865': '065064'
'2866': '065073'
'2867': '065076'
'2868': '065077'
'2869': 065090
'2870': '065234'
'2871': '065265'
'2872': 065488
'2873': 065619
'2874': 065683
'2875': 065685
'2876': '065745'
'2877': '065752'
'2878': '065755'
'2879': '065756'
'2880': '065777'
'2881': 065779
'2882': 065780
'2883': 065893
'2884': 066058
'2885': '066073'
'2886': '066074'
'2887': '066075'
'2888': '066076'
'2889': 066180
'2890': 066187
'2891': 066390
'2892': 066394
'2893': '066405'
'2894': 066469
'2895': 066482
'2896': 066483
'2897': '066525'
'2898': '066534'
'2899': '066535'
'2900': '066536'
'2901': '066537'
'2902': 066538
'2903': 066539
'2904': '066636'
'2905': '066637'
'2906': 066638
'2907': '066641'
'2908': '066643'
'2909': '066644'
'2910': '066646'
'2911': 066648
'2912': 066649
'2913': '066650'
'2914': 066689
'2915': 066690
'2916': '066717'
'2917': '066757'
'2918': 066782
'2919': 066783
'2920': '067007'
'2921': '067010'
'2922': '067011'
'2923': '067016'
'2924': '067017'
'2925': '067121'
'2926': '067163'
'2927': '067232'
'2928': '067233'
'2929': '067235'
'2930': '067237'
'2931': 067308
'2932': '067330'
'2933': '067331'
'2934': '067332'
'2935': '067333'
'2936': '067334'
'2937': '067336'
'2938': '067357'
'2939': 067358
'2940': 067359
'2941': '067360'
'2942': '067361'
'2943': '067362'
'2944': '067363'
'2945': '067364'
'2946': '067365'
'2947': '067366'
'2948': '067367'
'2949': 067368
'2950': '067412'
'2951': '067457'
'2952': '067470'
'2953': '067500'
'2954': '067553'
'2955': '067556'
'2956': '067557'
'2957': 067558
'2958': 067597
'2959': 067598
'2960': '067600'
'2961': '067637'
'2962': 067638
'2963': 067639
'2964': '067640'
'2965': '067660'
'2966': '067661'
'2967': '067673'
'2968': '067707'
'2969': '067760'
'2970': '067763'
'2971': '067764'
'2972': '067765'
'2973': '067766'
'2974': 067784
'2975': 067793
'2976': 067829
'2977': 068353
'2978': 068354
'2979': 068355
'2980': 068356
'2981': 068404
'2982': 068407
'2983': 068410
'2984': 068444
'2985': 068531
'2986': 068536
'2987': 068537
'2988': 068538
'2989': 068539
'2990': 068540
'2991': 068541
'2992': 068543
'2993': 068549
'2994': 068551
'2995': 068573
'2996': 068579
'2997': 068582
'2998': 068587
'2999': 068592
'3000': 068600
'3001': 068601
'3002': 068680
'3003': 068682
'3004': 068683
'3005': 068820
'3006': 068821
'3007': 068837
'3008': 068838
'3009': 068839
'3010': 068840
'3011': 068841
'3012': 068842
'3013': 068843
'3014': 068844
'3015': 068851
'3016': 068852
'3017': 068853
'3018': 068854
'3019': 068860
'3020': 068861
'3021': 068862
'3022': 068869
'3023': 068872
'3024': 068875
'3025': 068891
'3026': 068892
'3027': 068893
'3028': 068894
'3029': 068895
'3030': 068896
'3031': 068897
'3032': 068898
'3033': 068899
'3034': 068909
'3035': 069001
'3036': 069002
'3037': 069170
'3038': 069181
'3039': 069182
'3040': 069188
'3041': 069193
'3042': 069194
'3043': 069195
'3044': 069196
'3045': 069197
'3046': 069198
'3047': 069199
'3048': 069200
'3049': 069201
'3050': 069202
'3051': 069203
'3052': 069204
'3053': 069205
'3054': 069206
'3055': 069207
'3056': 069208
'3057': 069209
'3058': 069210
'3059': 069211
'3060': 069221
'3061': 069222
'3062': 069223
'3063': 069303
'3064': 069554
'3065': 069555
'3066': 069561
'3067': 069563
'3068': 069564
'3069': 069567
'3070': 069682
'3071': 069723
'3072': 069726
'3073': 069727
'3074': 069732
'3075': 069744
'3076': 069745
'3077': 069746
'3078': 069747
'3079': 069761
'3080': 069762
'3081': 069763
'3082': 069764
'3083': 069765
'3084': 069766
'3085': 069767
'3086': 069768
'3087': 069781
'3088': 069784
'3089': 069785
'3090': 069787
'3091': 069788
'3092': 069789
'3093': 069791
'3094': 069792
'3095': 069793
'3096': 069798
'3097': 069822
'3098': 069823
'3099': 069824
'3100': 069825
'3101': 069826
'3102': 069827
'3103': 069828
'3104': 069830
'3105': 069833
'3106': 069904
'3107': 069947
'3108': 069949
'3109': 069985
'3110': '070002'
'3111': '070005'
'3112': '070174'
'3113': '070206'
'3114': '070207'
'3115': 070208
'3116': 070299
'3117': '070300'
'3118': '070301'
'3119': '070302'
'3120': '070303'
'3121': '070402'
'3122': '070403'
'3123': 070409
'3124': '070423'
'3125': '070424'
'3126': '070425'
'3127': '070426'
'3128': '070654'
'3129': '070655'
'3130': '070657'
'3131': '070660'
'3132': 070768
'3133': '070770'
'3134': '070772'
'3135': '070773'
'3136': '070774'
'3137': '070775'
'3138': 070813
'3139': 070873
'3140': 070875
'3141': 070878
'3142': 070879
'3143': 071096
'3144': '071133'
'3145': '071157'
'3146': 071158
'3147': '071172'
'3148': '071173'
'3149': '071174'
'3150': '071175'
'3151': '071216'
'3152': '071225'
'3153': 071228
'3154': '071230'
'3155': '071231'
'3156': '071240'
'3157': '071241'
'3158': '071242'
'3159': '071243'
'3160': '071244'
'3161': '071245'
'3162': '071246'
'3163': '071247'
'3164': 071248
'3165': 071249
'3166': '071250'
'3167': '071251'
'3168': '071252'
'3169': '071253'
'3170': '071254'
'3171': '071255'
'3172': '071276'
'3173': '071303'
'3174': '071304'
'3175': '071371'
'3176': '071372'
'3177': '071420'
'3178': '071503'
'3179': '071506'
'3180': '071507'
'3181': 071508
'3182': 071509
'3183': '071510'
'3184': '071511'
'3185': '071512'
'3186': '071513'
'3187': '071514'
'3188': '071515'
'3189': '071516'
'3190': '071617'
'3191': '071620'
'3192': '071622'
'3193': 071690
'3194': 071691
'3195': 071692
'3196': 071693
'3197': 071694
'3198': 071695
'3199': 071709
'3200': '071711'
'3201': '071714'
'3202': '071715'
'3203': 071719
'3204': '071721'
'3205': '071722'
'3206': 071822
'3207': 071884
'3208': 071885
'3209': 071937
'3210': 071938
'3211': '072046'
'3212': '072047'
'3213': '072050'
'3214': '072056'
'3215': 072058
'3216': 072059
'3217': '072064'
'3218': '072067'
'3219': 072068
'3220': 072069
'3221': '072070'
'3222': '072071'
'3223': '072072'
'3224': '072073'
'3225': '072074'
'3226': '072075'
'3227': '072076'
'3228': 072129
'3229': '072130'
'3230': '072131'
'3231': '072134'
'3232': '072135'
'3233': '072136'
'3234': '072146'
'3235': 072149
'3236': '072200'
'3237': '072206'
'3238': '072210'
'3239': '072215'
'3240': '072232'
'3241': '072233'
'3242': '072234'
'3243': 072287
'3244': 072288
'3245': 072289
'3246': 072290
'3247': '072456'
'3248': 072468
'3249': '072476'
'3250': '072477'
'3251': '072513'
'3252': '072514'
'3253': '072562'
'3254': '072565'
'3255': '072570'
'3256': '072604'
'3257': '072605'
'3258': '072607'
'3259': '072612'
'3260': 072738
'3261': 072781
'3262': 072782
'3263': 072783
'3264': 072784
'3265': 072785
'3266': 072786
'3267': 072787
'3268': 072788
'3269': 072789
'3270': 072790
'3271': 072926
'3272': 072927
'3273': 072928
'3274': 072930
'3275': 073087
'3276': 073099
'3277': '073100'
'3278': '073123'
'3279': '073124'
'3280': '073125'
'3281': 073169
'3282': '073170'
'3283': '073171'
'3284': '073172'
'3285': '073174'
'3286': '073175'
'3287': 073192
'3288': 073193
'3289': '073306'
'3290': 073309
'3291': 073318
'3292': '073335'
'3293': '073340'
'3294': '073341'
'3295': '073342'
'3296': '073343'
'3297': '073344'
'3298': '073363'
'3299': '073365'
'3300': '073366'
'3301': '073367'
'3302': 073368
'3303': 073369
'3304': '073370'
'3305': '073371'
'3306': '073372'
'3307': '073465'
'3308': '073466'
'3309': '073467'
'3310': 073468
'3311': 073469
'3312': 073486
'3313': 073494
'3314': 073495
'3315': 073519
'3316': '073520'
'3317': '073521'
'3318': '073522'
'3319': '073550'
'3320': '073551'
'3321': '073560'
'3322': '073561'
'3323': '073564'
'3324': '073565'
'3325': '073566'
'3326': 073568
'3327': '073572'
'3328': '073573'
'3329': 073580
'3330': 073584
'3331': 073585
'3332': 073587
'3333': 073658
'3334': '073675'
'3335': '073760'
'3336': '073761'
'3337': '073762'
'3338': '073763'
'3339': '073764'
'3340': '073765'
'3341': '073766'
'3342': '073767'
'3343': 073768
'3344': 073769
'3345': '073770'
'3346': '073771'
'3347': '073772'
'3348': '073773'
'3349': '073774'
'3350': '073775'
'3351': '073776'
'3352': '073777'
'3353': 073778
'3354': 073779
'3355': 073792
'3356': 073797
'3357': 073819
'3358': 073820
'3359': 073821
'3360': 073822
'3361': 073921
'3362': '074002'
'3363': '074302'
'3364': '074347'
'3365': 074348
'3366': '074362'
'3367': '074365'
'3368': '074370'
'3369': '074371'
'3370': '074372'
'3371': '074373'
'3372': '074374'
'3373': '074375'
'3374': '074376'
'3375': '074377'
'3376': 074378
'3377': 074380
'3378': 074381
'3379': 074382
'3380': 074383
'3381': 074384
'3382': 074385
'3383': 074386
'3384': 074387
'3385': 074388
'3386': 074389
'3387': 074390
'3388': 074391
'3389': 074392
'3390': 074393
'3391': '074421'
'3392': '074445'
'3393': '074546'
'3394': 074669
'3395': '074671'
'3396': '074706'
'3397': 074908
'3398': 074937
'3399': 074942
'3400': 074945
'3401': 074954
'3402': 074955
'3403': 074959
'3404': 074960
'3405': 075194
'3406': '075211'
'3407': '075221'
'3408': '075230'
'3409': '075304'
'3410': '075310'
'3411': '075314'
'3412': '075317'
'3413': '075371'
'3414': '075372'
'3415': '075373'
'3416': '075374'
'3417': '075375'
'3418': '075376'
'3419': '075377'
'3420': 075378
'3421': 075379
'3422': 075380
'3423': 075381
'3424': 075383
'3425': 075386
'3426': 075389
'3427': 075390
'3428': 075391
'3429': 075393
'3430': 075395
'3431': 075396
'3432': 075398
'3433': 075399
'3434': '075401'
'3435': '075403'
'3436': '075412'
'3437': '075415'
'3438': '075417'
'3439': 075418
'3440': 075419
'3441': '075420'
'3442': '075425'
'3443': '075427'
'3444': 075428
'3445': 075429
'3446': '075430'
'3447': '075431'
'3448': '075432'
'3449': '075433'
'3450': '075434'
'3451': '075435'
'3452': '075436'
'3453': '075437'
'3454': 075438
'3455': 075439
'3456': '075440'
'3457': '075441'
'3458': '075442'
'3459': '075443'
'3460': '075607'
'3461': '075612'
'3462': 075692
'3463': '075745'
'3464': '075746'
'3465': '075747'
'3466': 075748
'3467': 075749
'3468': '075750'
'3469': '075751'
'3470': '075752'
'3471': '075754'
'3472': '075755'
'3473': '075762'
'3474': '075763'
'3475': '075764'
'3476': 075782
'3477': 075783
'3478': 075784
'3479': 075785
'3480': 075786
'3481': 075787
'3482': 075788
'3483': 075844
'3484': 075862
'3485': 075866
'3486': 075869
'3487': 075883
'3488': 075903
'3489': 075908
'3490': 075925
'3491': 075926
'3492': 075927
'3493': 075928
'3494': 075929
'3495': 075930
'3496': 075931
'3497': 075932
'3498': 075933
'3499': 075935
'3500': 075936
'3501': 075937
'3502': 075975
'3503': '076036'
'3504': 076069
'3505': '076071'
'3506': '076072'
'3507': '076073'
'3508': '076074'
'3509': '076075'
'3510': '076076'
'3511': '076077'
'3512': 076078
'3513': 076079
'3514': '076121'
'3515': 076128
'3516': 076129
'3517': '076130'
'3518': '076131'
'3519': '076363'
'3520': '076375'
'3521': 076381
'3522': '076437'
'3523': '076440'
'3524': '076654'
'3525': 076659
'3526': '077517'
'3527': 077519
'3528': '077521'
'3529': '077522'
'3530': '077523'
'3531': '077564'
'3532': '077571'
'3533': '077572'
'3534': 077952
'3535': 078038
'3536': 078156
'3537': 078213
'3538': 078516
'3539': 078833
'3540': 078834
'3541': 078839
'3542': 078841
'3543': 078843
'3544': 078845
'3545': 078847
'3546': 078848
'3547': 078849
'3548': 078850
'3549': 078851
'3550': 078852
'3551': 078984
'3552': 078998
'3553': 079087
'3554': 079575
'3555': 079593
'3556': 079605
'3557': 079606
'3558': 079610
'3559': 079616
'3560': 079741
'3561': 079973
'3562': 079975
'3563': 079977
'3564': 079978
'3565': 079985
'3566': 079986
'3567': 079988
'3568': 079990
'3569': 079995
'3570': 080000
'3571': 080001
'3572': 080002
'3573': 080003
'3574': 080004
'3575': 080005
'3576': 080035
'3577': 080293
'3578': 080341
'3579': 080351
'3580': 080389
'3581': 080402
'3582': 080515
'3583': 080516
'3584': 080517
'3585': 080518
'3586': 080519
'3587': 080520
'3588': 080611
'3589': 080680
'3590': 080686
'3591': 080687
'3592': 080693
'3593': 080694
'3594': 080695
'3595': 080696
'3596': 080697
'3597': 080751
'3598': 080753
'3599': 080754
'3600': 080755
'3601': 080756
'3602': 080758
'3603': 080765
'3604': 080766
'3605': 080772
'3606': 080773
'3607': 080774
'3608': 080775
'3609': 080776
'3610': 080793
'3611': 080833
'3612': 080834
'3613': 080835
'3614': 080836
'3615': 081033
'3616': 081037
'3617': 081071
'3618': 081082
'3619': 081083
'3620': 081084
'3621': 081085
'3622': 081189
'3623': 081193
'3624': 081194
'3625': 081195
'3626': 081362
'3627': 081365
'3628': 081436
'3629': 081457
'3630': 081485
'3631': 081491
'3632': 081512
'3633': 081523
'3634': 081543
'3635': 081554
'3636': 081555
'3637': 081565
'3638': 081576
'3639': 081586
'3640': 081600
'3641': 081612
'3642': 081613
'3643': 081623
'3644': 081638
'3645': 081650
'3646': 081660
'3647': 081781
'3648': 081782
'3649': 081792
'3650': 081802
'3651': 081803
'3652': 081814
'3653': 081868
'3654': 081895
'3655': 081938
'3656': 081945
'3657': 081946
'3658': 081988
'3659': 081999
'3660': 082157
'3661': 082231
'3662': 082237
'3663': 082242
'3664': 082250
'3665': 082410
'3666': 082462
'3667': 082464
'3668': 082505
'3669': 082507
'3670': 082628
'3671': 082629
'3672': 082630
'3673': 082631
'3674': 082778
'3675': 082780
'3676': 082881
'3677': 082886
'3678': 082890
'3679': 082892
'3680': 082893
'3681': 082914
'3682': 082915
'3683': 082916
'3684': 082917
'3685': 082918
'3686': 082919
'3687': 082920
'3688': 082921
'3689': 082928
'3690': 082929
'3691': 082930
'3692': 082931
'3693': 082932
'3694': 083437
'3695': 083438
'3696': 083439
'3697': 083440
'3698': 083507
'3699': 083509
'3700': 083511
'3701': 083512
'3702': 083558
'3703': 083600
'3704': 083612
'3705': 083613
'3706': 083715
'3707': 083717
'3708': 083718
'3709': 083719
'3710': 083789
'3711': 083790
'3712': 083791
'3713': 083898
'3714': 083903
'3715': 083906
'3716': 083908
'3717': 083911
'3718': 083913
'3719': 083954
'3720': 083960
'3721': 083969
'3722': 084009
'3723': 084054
'3724': 084055
'3725': 084056
'3726': 084057
'3727': 084058
'3728': 084091
'3729': 084095
'3730': 084096
'3731': 084097
'3732': 084111
'3733': 084135
'3734': 084136
'3735': 084139
'3736': 084141
'3737': 084142
'3738': 084144
'3739': 084152
'3740': 084154
'3741': 084155
'3742': 084156
'3743': 084157
'3744': 084158
'3745': 084159
'3746': 084195
'3747': 084198
'3748': 084200
'3749': 084201
'3750': 084202
'3751': 084264
'3752': 084290
'3753': 084291
'3754': 084405
'3755': 084417
'3756': 084423
'3757': 084483
'3758': 084484
'3759': 084485
'3760': 084486
'3761': 084605
'3762': 084736
'3763': 084743
'3764': 084757
'3765': 084768
'3766': 084777
'3767': 084788
'3768': 084817
'3769': 085027
'3770': 085038
'3771': 085039
'3772': 085040
'3773': 085041
'3774': 085290
'3775': 085291
'3776': 085307
'3777': 085308
'3778': 085309
'3779': 085310
'3780': 085311
'3781': 085317
'3782': 085318
'3783': 085343
'3784': 085346
'3785': 085347
'3786': 085400
'3787': 085419
'3788': 085420
'3789': 085421
'3790': 085422
'3791': 085423
'3792': 085424
'3793': 085425
'3794': 085426
'3795': 085427
'3796': 085428
'3797': 085436
'3798': 085438
'3799': 085482
'3800': 085484
'3801': 085485
'3802': 085486
'3803': 085487
'3804': 085488
'3805': 085489
'3806': 085490
'3807': 085491
'3808': 085492
'3809': 085494
'3810': 085592
'3811': 085593
'3812': 085594
'3813': 085595
'3814': 085596
'3815': 085598
'3816': 085599
'3817': 085600
'3818': 085691
'3819': 085692
'3820': 085693
'3821': 085787
'3822': 085788
'3823': 085791
'3824': 085792
'3825': 085816
'3826': 085817
'3827': 085822
'3828': 085823
'3829': 085828
'3830': 085831
'3831': 085832
'3832': 085833
'3833': 085834
'3834': 085835
'3835': 085836
'3836': 085837
'3837': 085838
'3838': 085839
'3839': 085840
'3840': 085950
'3841': 085951
'3842': 085952
'3843': 085953
'3844': 085954
'3845': 085955
'3846': 085956
'3847': 085957
'3848': 085963
'3849': 085966
'3850': 085967
'3851': 085968
'3852': 085973
'3853': 086037
'3854': 086038
'3855': 086039
'3856': 086040
'3857': 086077
'3858': 086081
'3859': 086082
'3860': 086116
'3861': 086117
'3862': 086118
'3863': 086119
'3864': 086140
'3865': 086256
'3866': 086259
'3867': 086262
'3868': 086263
'3869': 086415
'3870': 086416
'3871': 086417
'3872': 086419
'3873': 086441
'3874': 086443
'3875': 086481
'3876': 086482
'3877': 086483
'3878': 086484
'3879': 086485
'3880': 086486
'3881': 086487
'3882': 086562
'3883': 086576
'3884': 086623
'3885': 086634
'3886': 086678
'3887': 086679
'3888': 086680
'3889': 086720
'3890': 086721
'3891': 086724
'3892': 086725
'3893': 086730
'3894': 086761
'3895': 086762
'3896': 086763
'3897': 086788
'3898': 086793
'3899': 086795
'3900': 086799
'3901': 086993
'3902': 087068
'3903': 087069
'3904': 087070
'3905': 087096
'3906': 087097
'3907': 087098
'3908': 087099
'3909': 087100
'3910': 087101
'3911': 087102
'3912': 087103
'3913': 087104
'3914': 087105
'3915': 087106
'3916': 087107
'3917': 087108
'3918': 087121
'3919': 087151
'3920': 087152
'3921': 087153
'3922': 087154
'3923': 087155
'3924': 087157
'3925': 087158
'3926': 087159
'3927': 087160
'3928': 087161
'3929': 087185
'3930': 087186
'3931': 087187
'3932': 087188
'3933': 087189
'3934': 087190
'3935': 087191
'3936': 087192
'3937': 087193
'3938': 087194
'3939': 087237
'3940': 087322
'3941': 087323
'3942': 087324
'3943': 087325
'3944': 087361
'3945': 087362
'3946': 087363
'3947': 087377
'3948': 087430
'3949': 087431
'3950': 087490
'3951': 087639
'3952': 087641
'3953': 087642
'3954': 087643
'3955': 087644
'3956': 087645
'3957': 087965
'3958': 087966
'3959': 087967
'3960': 087968
'3961': 087971
'3962': 087972
'3963': 088428
'3964': 088429
'3965': 088485
'3966': 088486
'3967': 088846
'3968': 088848
'3969': 088854
'3970': 088856
'3971': 088858
'3972': 088860
'3973': 088861
'3974': 088863
'3975': 088864
'3976': 088867
'3977': 088868
'3978': 088869
'3979': 088870
'3980': 088871
'3981': 088872
'3982': 088873
'3983': 088874
'3984': 088875
'3985': 088876
'3986': 088877
'3987': 088878
'3988': 088879
'3989': 088892
'3990': 088899
'3991': 088900
'3992': 088959
'3993': 088960
'3994': 089178
'3995': 089179
'3996': 089192
'3997': 089195
'3998': 089196
'3999': 089212
'4000': 089350
'4001': 089376
'4002': 089441
'4003': 089445
'4004': 089447
'4005': 089456
'4006': 089473
'4007': 089474
'4008': 089477
'4009': 089482
'4010': 089484
'4011': 089485
'4012': 089486
'4013': 089639
'4014': 089704
'4015': 089814
'4016': 089815
'4017': 089816
'4018': 089817
'4019': 089841
'4020': 089843
'4021': 089846
'4022': 089847
'4023': 089848
'4024': 089857
'4025': 089859
'4026': 089860
'4027': 089991
'4028': 089992
'4029': 090027
'4030': 090074
'4031': 090278
'4032': 090526
'4033': 090527
'4034': 090529
'4035': 090530
'4036': 090570
'4037': 090579
'4038': 090582
'4039': 090583
'4040': 090587
'4041': 090589
'4042': 090590
'4043': 090591
'4044': 090592
'4045': 090616
'4046': 090617
'4047': 090618
'4048': 090625
'4049': 090639
'4050': 090652
'4051': 090695
'4052': 090804
'4053': 090824
'4054': 090826
'4055': 090828
'4056': 090982
'4057': 090987
'4058': '090993'
'4059': '091081'
'4060': '091082'
'4061': '091083'
'4062': '091084'
'4063': '091085'
'4064': '091086'
'4065': '091087'
'4066': '091088'
'4067': '091089'
'4068': '091092'
'4069': '091093'
'4070': '091098'
'4071': '091102'
'4072': '091130'
'4073': '091157'
'4074': '091158'
'4075': '091159'
'4076': '091160'
'4077': '091161'
'4078': '091162'
'4079': '091163'
'4080': '091164'
'4081': '091170'
'4082': '091177'
'4083': '091178'
'4084': '091179'
'4085': '091181'
'4086': '091182'
'4087': '091183'
'4088': '091184'
'4089': '091185'
'4090': '091186'
'4091': '091187'
'4092': '091205'
'4093': '091228'
'4094': '091238'
'4095': '091306'
'4096': '091309'
'4097': '091312'
'4098': '091315'
'4099': '091317'
'4100': '091318'
'4101': '091319'
'4102': '091329'
'4103': '091349'
'4104': '091443'
'4105': '091455'
'4106': '091458'
'4107': '091459'
'4108': '091468'
'4109': '091471'
'4110': '091619'
'4111': '091620'
'4112': '091621'
'4113': '091622'
'4114': '091623'
'4115': '091624'
'4116': '091625'
'4117': '091755'
'4118': '091788'
'4119': '091790'
'4120': '091791'
'4121': '091793'
'4122': '091796'
'4123': '091797'
'4124': '091851'
'4125': '091868'
'4126': '091869'
'4127': '091894'
'4128': '091897'
'4129': '091899'
'4130': '091900'
'4131': '091933'
'4132': '091934'
'4133': '091936'
'4134': '091937'
'4135': '091938'
'4136': '091958'
'4137': '091960'
'4138': '092124'
'4139': '092125'
'4140': '092129'
'4141': '092130'
'4142': '092131'
'4143': '092206'
'4144': '092275'
'4145': '092282'
'4146': '092283'
'4147': '092284'
'4148': '092292'
'4149': '092366'
'4150': '092466'
'4151': '092508'
'4152': '092535'
'4153': '092536'
'4154': '092538'
'4155': '092539'
'4156': '092540'
'4157': '092546'
'4158': '092548'
'4159': '092549'
'4160': '092551'
'4161': '092554'
'4162': '092556'
'4163': '092561'
'4164': '092562'
'4165': '092564'
'4166': '092565'
'4167': '092573'
'4168': '092574'
'4169': '092868'
'4170': '092872'
'4171': '092873'
'4172': '092874'
'4173': '092878'
'4174': '092881'
'4175': '092885'
'4176': '092886'
'4177': '092887'
'4178': '092888'
'4179': '092889'
'4180': '092947'
'4181': '092948'
'4182': '092949'
'4183': '092950'
'4184': '092951'
'4185': '092952'
'4186': '092953'
'4187': '092954'
'4188': '092955'
'4189': '093074'
'4190': '093075'
'4191': '093076'
'4192': '093363'
'4193': '093364'
'4194': '093518'
'4195': '093519'
'4196': '093520'
'4197': '093521'
'4198': '093522'
'4199': '093523'
'4200': '093704'
'4201': '093710'
'4202': '093712'
'4203': '093716'
'4204': '093727'
'4205': '093867'
'4206': '093868'
'4207': '093915'
'4208': '093917'
'4209': '093918'
'4210': '093919'
'4211': '093920'
'4212': '093921'
'4213': '093940'
'4214': '093941'
'4215': '093942'
'4216': '093943'
'4217': '093944'
'4218': '093950'
'4219': '093956'
'4220': '093981'
'4221': '093983'
'4222': '093985'
'4223': '093986'
'4224': '094026'
'4225': '094033'
'4226': '094034'
'4227': '094035'
'4228': '094036'
'4229': '094037'
'4230': '094038'
'4231': '094039'
'4232': '094093'
'4233': '094099'
'4234': '094101'
'4235': '094102'
'4236': '094263'
'4237': '094348'
'4238': '094411'
'4239': '094414'
'4240': '094415'
'4241': '094419'
'4242': '094422'
'4243': '094423'
'4244': '094426'
'4245': '094449'
'4246': '094465'
'4247': '094467'
'4248': '094468'
'4249': '094628'
'4250': '094630'
'4251': '094631'
'4252': '094632'
'4253': '094634'
'4254': '094635'
'4255': '094638'
'4256': '094803'
'4257': '095189'
'4258': '095231'
'4259': '095248'
'4260': '095249'
'4261': '095250'
'4262': '095251'
'4263': '095308'
'4264': '095309'
'4265': '095310'
'4266': '095452'
'4267': '095486'
'4268': '095506'
'4269': '095535'
'4270': '095564'
'4271': '095722'
'4272': '095724'
'4273': '095725'
'4274': '095726'
'4275': '095727'
'4276': '095908'
'4277': '095910'
'4278': '095911'
'4279': '095912'
'4280': '095914'
'4281': '095915'
'4282': '096166'
'4283': '096167'
'4284': '096168'
'4285': '096169'
'4286': '096399'
'4287': '096400'
'4288': '096401'
'4289': '096402'
'4290': '096403'
'4291': '096408'
'4292': '096560'
'4293': '096627'
'4294': '096657'
'4295': '096675'
'4296': '096678'
'4297': '096692'
'4298': '096693'
'4299': '096694'
'4300': '096695'
'4301': '096696'
'4302': '096697'
'4303': '096698'
'4304': '096699'
'4305': '096718'
'4306': '096726'
'4307': '096728'
'4308': '096729'
'4309': '096730'
'4310': '096731'
'4311': '096738'
'4312': '096742'
'4313': '096743'
'4314': '096759'
'4315': '096898'
'4316': '096900'
'4317': '096901'
'4318': '096902'
'4319': '096935'
'4320': '096936'
'4321': '096944'
'4322': '096945'
'4323': '096946'
'4324': '097037'
'4325': '097041'
'4326': '097043'
'4327': '097211'
'4328': '097215'
'4329': '097216'
'4330': '097279'
'4331': '097283'
'4332': '097285'
'4333': '097286'
'4334': '097373'
'4335': '097374'
'4336': '097393'
'4337': '097404'
'4338': '097406'
'4339': '097407'
'4340': '097424'
'4341': '097540'
'4342': '097542'
'4343': '097544'
'4344': '097545'
'4345': '097547'
'4346': '097548'
'4347': '097568'
'4348': '097569'
'4349': '097570'
'4350': '097585'
'4351': '097586'
'4352': '097587'
'4353': '097588'
'4354': '097589'
'4355': '097590'
'4356': '097690'
'4357': '097691'
'4358': '097692'
'4359': '097697'
'4360': '097793'
'4361': '097794'
'4362': '097813'
'4363': '097814'
'4364': '097841'
'4365': '097844'
'4366': '097845'
'4367': '097846'
'4368': '097847'
'4369': '097848'
'4370': '097886'
'4371': '097887'
'4372': '097894'
'4373': '097940'
'4374': '097958'
'4375': '097959'
'4376': '097960'
'4377': '097961'
'4378': '097962'
'4379': '097980'
'4380': '097986'
'4381': '097987'
'4382': '097988'
'4383': '097989'
'4384': '098025'
'4385': '098026'
'4386': '098028'
'4387': '098031'
'4388': '098077'
'4389': '098202'
'4390': '098203'
'4391': '098204'
'4392': '098205'
'4393': '098206'
'4394': '098227'
'4395': '098228'
'4396': '098229'
'4397': '098235'
'4398': '098236'
'4399': '098237'
'4400': '098238'
'4401': '098251'
'4402': '098297'
'4403': '098298'
'4404': '098299'
'4405': '098300'
'4406': '098301'
'4407': '098302'
'4408': '098339'
'4409': '098346'
'4410': '098348'
'4411': '098349'
'4412': '098547'
'4413': '098548'
'4414': '098549'
'4415': '098550'
'4416': '098551'
'4417': '098552'
'4418': '098553'
'4419': '098554'
'4420': '098555'
'4421': '098556'
'4422': '098557'
'4423': '098565'
'4424': '098567'
'4425': '098569'
'4426': '098573'
'4427': '098574'
'4428': '098575'
'4429': '098576'
'4430': '098577'
'4431': '098578'
'4432': '098579'
'4433': '098580'
'4434': '098581'
'4435': '098582'
'4436': '098583'
'4437': '098584'
'4438': '098585'
'4439': '098613'
'4440': '098617'
'4441': '098618'
'4442': '098619'
'4443': '098620'
'4444': '098621'
'4445': '098622'
'4446': '098623'
'4447': '098624'
'4448': '098625'
'4449': '098626'
'4450': '098627'
'4451': '098628'
'4452': '098655'
'4453': '098656'
'4454': '098657'
'4455': '098666'
'4456': '098667'
'4457': '098668'
'4458': '098669'
'4459': '098670'
'4460': '098671'
'4461': '098680'
'4462': '098681'
'4463': '098701'
'4464': '098770'
'4465': '098838'
'4466': '099041'
'4467': '099093'
'4468': '099095'
'4469': '099096'
'4470': '099135'
'4471': '099214'
'4472': '099260'
'4473': '099261'
'4474': '099274'
'4475': '099311'
'4476': '099313'
'4477': '099345'
'4478': '099361'
'4479': '099362'
'4480': '099363'
'4481': '099364'
'4482': '099368'
'4483': '099369'
'4484': '099370'
'4485': '099371'
'4486': '099372'
'4487': '099373'
'4488': '099374'
'4489': '099375'
'4490': '099389'
'4491': '099390'
'4492': '099391'
'4493': '099392'
'4494': '099393'
'4495': '099394'
'4496': '099395'
'4497': '099411'
'4498': '099419'
'4499': '099436'
'4500': '099437'
'4501': '099438'
'4502': '099439'
'4503': '099440'
'4504': '099441'
'4505': '099442'
'4506': '099501'
'4507': '099703'
'4508': '099704'
'4509': '099707'
'4510': '100478'
'4511': '100479'
'4512': '100480'
'4513': '100497'
'4514': '100522'
'4515': '100535'
'4516': '100536'
'4517': '100544'
'4518': '100549'
'4519': '100550'
'4520': '100552'
'4521': '100745'
'4522': '100799'
'4523': '100802'
'4524': '100835'
'4525': '100949'
'4526': '100958'
'4527': '100959'
'4528': '100972'
'4529': '100973'
'4530': '100975'
'4531': '100976'
'4532': '101111'
'4533': '101112'
'4534': '101116'
'4535': '101118'
'4536': '101119'
'4537': '101864'
'4538': '101868'
'4539': '101873'
'4540': '101893'
'4541': '101951'
'4542': '102092'
'4543': '102112'
'4544': '102114'
'4545': '102195'
'4546': '103518'
'4547': '103519'
'4548': '103520'
'4549': '103521'
'4550': '103522'
'4551': '103523'
'4552': '103600'
'4553': '103800'
'4554': '103808'
'4555': '104008'
'4556': '104009'
'4557': '104010'
'4558': '104062'
'4559': '104063'
'4560': '104064'
'4561': '104065'
'4562': '104066'
'4563': '104067'
'4564': '104068'
'4565': '104086'
'4566': '104227'
'4567': '104276'
'4568': '104277'
'4569': '104278'
'4570': '104279'
'4571': '104282'
'4572': '104283'
'4573': '104284'
'4574': '104356'
'4575': '104357'
'4576': '104434'
'4577': '104625'
'4578': '104668'
'4579': '104724'
'4580': '104725'
'4581': '104779'
'4582': '104780'
'4583': '105022'
'4584': '105119'
'4585': '105141'
'4586': '105142'
'4587': '105144'
'4588': '105145'
'4589': '105196'
'4590': '105408'
'4591': '105411'
'4592': '105412'
'4593': '105413'
'4594': '105414'
'4595': '105443'
'4596': '105450'
'4597': '105451'
'4598': '105662'
'4599': '105664'
'4600': '105670'
'4601': '105671'
'4602': '105672'
'4603': '105673'
'4604': '105674'
'4605': '105682'
'4606': '105683'
'4607': '105685'
'4608': '105712'
'4609': '105713'
'4610': '105714'
'4611': '105715'
'4612': '105716'
'4613': '105717'
'4614': '105718'
'4615': '105719'
'4616': '105720'
'4617': '105722'
'4618': '105824'
'4619': '105825'
'4620': '105826'
'4621': '105827'
'4622': '105887'
'4623': '105890'
'4624': '105912'
'4625': '105914'
'4626': '105915'
'4627': '105916'
'4628': '105917'
'4629': '105918'
'4630': '105919'
'4631': '105920'
'4632': '106274'
'4633': '106277'
'4634': '106339'
'4635': '106342'
'4636': '106343'
'4637': '106456'
'4638': '106457'
'4639': '106458'
'4640': '106463'
'4641': '106465'
'4642': '106502'
'4643': '106522'
'4644': '106562'
'4645': '106563'
'4646': '106564'
'4647': '106566'
'4648': '106567'
'4649': '106568'
'4650': '106569'
'4651': '106570'
'4652': '106571'
'4653': '106629'
'4654': '106872'
'4655': '106876'
'4656': '106877'
'4657': '106937'
'4658': '106948'
'4659': '106951'
'4660': '106952'
'4661': '106953'
'4662': '106954'
'4663': '106955'
'4664': '106956'
'4665': '107020'
'4666': '107021'
'4667': '107025'
'4668': '107027'
'4669': '107028'
'4670': '107029'
'4671': '107030'
'4672': '107031'
'4673': '107046'
'4674': '107047'
'4675': '107048'
'4676': '107049'
'4677': '107050'
'4678': '107101'
'4679': '107125'
'4680': '107126'
'4681': '107127'
'4682': '107128'
'4683': '107129'
'4684': '107178'
'4685': '107179'
'4686': '107180'
'4687': '107181'
'4688': '107182'
'4689': '107183'
'4690': '107184'
'4691': '107185'
'4692': '107186'
'4693': '107187'
'4694': '107188'
'4695': '107189'
'4696': '107248'
'4697': '107249'
'4698': '107250'
'4699': '107251'
'4700': '107256'
'4701': '107257'
'4702': '107388'
'4703': '107389'
'4704': '107390'
'4705': '107391'
'4706': '107425'
'4707': '107426'
'4708': '107427'
'4709': '107429'
'4710': '107432'
'4711': '107433'
'4712': '107434'
'4713': '107435'
'4714': '107476'
'4715': '107506'
'4716': '107531'
'4717': '107532'
'4718': '107533'
'4719': '107534'
'4720': '107535'
'4721': '107567'
'4722': '107569'
'4723': '107571'
'4724': '107574'
'4725': '107577'
'4726': '107578'
'4727': '107579'
'4728': '107583'
'4729': '107584'
'4730': '107588'
'4731': '107589'
'4732': '107590'
'4733': '107591'
'4734': '107592'
'4735': '107593'
'4736': '107594'
'4737': '107595'
'4738': '107596'
'4739': '107597'
'4740': '107598'
'4741': '107613'
'4742': '107616'
'4743': '107617'
'4744': '107659'
'4745': '107799'
'4746': '107804'
'4747': '107805'
'4748': '107809'
'4749': '107810'
'4750': '107850'
'4751': '107851'
'4752': '107852'
'4753': '107908'
'4754': '107909'
'4755': '107910'
'4756': '107911'
'4757': '107912'
'4758': '107913'
'4759': '107949'
'4760': '107950'
'4761': '107951'
'4762': '107952'
'4763': '107953'
'4764': '107954'
'4765': '107955'
'4766': '107956'
'4767': '107957'
'4768': '108012'
'4769': '108014'
'4770': '108015'
'4771': '108016'
'4772': '108017'
'4773': '108018'
'4774': '108019'
'4775': '108020'
'4776': '108021'
'4777': '108022'
'4778': '108023'
'4779': '108024'
'4780': '108025'
'4781': '108026'
'4782': '108027'
'4783': '108031'
'4784': '108036'
'4785': '108037'
'4786': '108038'
'4787': '108049'
'4788': '108050'
'4789': '108059'
'4790': '108060'
'4791': '108079'
'4792': '108155'
'4793': '108230'
'4794': '108290'
'4795': '108297'
'4796': '108298'
'4797': '108299'
'4798': '108300'
'4799': '108301'
'4800': '108302'
'4801': '108303'
'4802': '108304'
'4803': '108305'
'4804': '108306'
'4805': '108307'
'4806': '108308'
'4807': '108313'
'4808': '108314'
'4809': '108318'
'4810': '108319'
'4811': '108339'
'4812': '108341'
'4813': '108342'
'4814': '108343'
'4815': '108415'
'4816': '108416'
'4817': '108418'
'4818': '108420'
'4819': '108421'
'4820': '108422'
'4821': '108423'
'4822': '108425'
'4823': '108426'
'4824': '108427'
'4825': '108428'
'4826': '108429'
'4827': '108456'
'4828': '108457'
'4829': '108459'
'4830': '108460'
'4831': '108461'
'4832': '108464'
'4833': '108471'
'4834': '108472'
'4835': '108473'
'4836': '108474'
'4837': '108475'
'4838': '108476'
'4839': '108477'
'4840': '108478'
'4841': '108487'
'4842': '108488'
'4843': '108489'
'4844': '108490'
'4845': '108491'
'4846': '108492'
'4847': '108493'
'4848': '108494'
'4849': '108495'
'4850': '108496'
'4851': '108497'
'4852': '108498'
'4853': '108499'
'4854': '108500'
'4855': '108501'
'4856': '108502'
'4857': '108503'
'4858': '108504'
'4859': '108505'
'4860': '108524'
'4861': '108525'
'4862': '108526'
'4863': '108527'
'4864': '108528'
'4865': '108529'
'4866': '108530'
'4867': '108531'
'4868': '108532'
'4869': '108533'
'4870': '108745'
'4871': '108774'
'4872': '108799'
'4873': '108808'
'4874': '108809'
'4875': '108812'
'4876': '108836'
'4877': '108837'
'4878': '108838'
'4879': '108839'
'4880': '108840'
'4881': '108841'
'4882': '108842'
'4883': '108843'
'4884': '108845'
'4885': '108846'
'4886': '108847'
'4887': '108863'
'4888': '108864'
'4889': '108865'
'4890': '108866'
'4891': '108867'
'4892': '108868'
'4893': '108878'
'4894': '108879'
'4895': '108880'
'4896': '108881'
'4897': '108882'
'4898': '108883'
'4899': '108884'
'4900': '108885'
'4901': '108906'
'4902': '108957'
'4903': '108961'
'4904': '108962'
'4905': '108967'
'4906': '108968'
'4907': '108969'
'4908': '108970'
'4909': '108992'
'4910': '109068'
'4911': '109071'
'4912': '109072'
'4913': '109106'
'4914': '109144'
'4915': '109189'
'4916': '109191'
'4917': '109203'
'4918': '109235'
'4919': '109276'
'4920': '109349'
'4921': '109350'
'4922': '109355'
'4923': '109356'
'4924': '109357'
'4925': '109445'
'4926': '109446'
'4927': '109447'
'4928': '109448'
'4929': '109449'
'4930': '109450'
'4931': '109468'
'4932': '109480'
'4933': '109481'
'4934': '109497'
'4935': '109535'
'4936': '109537'
'4937': '109538'
'4938': '109542'
'4939': '109543'
'4940': '109548'
'4941': '109670'
'4942': '109681'
'4943': '109684'
'4944': '109685'
'4945': '109686'
'4946': '109687'
'4947': '109711'
'4948': '109712'
'4949': '109896'
'4950': '109900'
'4951': '109901'
'4952': '109902'
'4953': '109903'
'4954': '109904'
'4955': '109905'
'4956': '109906'
'4957': '109925'
'4958': '109957'
'4959': '109958'
'4960': '109960'
'4961': '109962'
'4962': '109963'
'4963': '109971'
'4964': '109972'
'4965': '109973'
'4966': '109974'
'4967': '109975'
'4968': '109976'
'4969': '109977'
'4970': '109978'
'4971': '110070'
'4972': '110082'
'4973': '110084'
'4974': '110085'
'4975': '110086'
'4976': '110102'
'4977': '110103'
'4978': '110104'
'4979': '110105'
'4980': '110106'
'4981': '110107'
'4982': '110108'
'4983': '110109'
'4984': '110110'
'4985': '110111'
'4986': '110166'
'4987': '110167'
'4988': '110171'
'4989': '110172'
'4990': '110204'
'4991': '110205'
'4992': '110206'
'4993': '110207'
'4994': '110208'
'4995': '110209'
'4996': '110230'
'4997': '110259'
'4998': '110260'
'4999': '110261'
'5000': '110262'
'5001': '110263'
'5002': '110264'
'5003': '110265'
'5004': '110266'
'5005': '110267'
'5006': '110274'
'5007': '110384'
'5008': '110410'
'5009': '110417'
'5010': '110436'
'5011': '110437'
'5012': '110438'
'5013': '110439'
'5014': '110440'
'5015': '110441'
'5016': '110447'
'5017': '110448'
'5018': '110449'
'5019': '110450'
'5020': '110451'
'5021': '110452'
'5022': '110546'
'5023': '110610'
'5024': '110611'
'5025': '110623'
'5026': '110629'
'5027': '110630'
'5028': '110634'
'5029': '110636'
'5030': '110637'
'5031': '110647'
'5032': '110648'
'5033': '110649'
'5034': '110650'
'5035': '110651'
'5036': '110652'
'5037': '110653'
'5038': '110654'
'5039': '110681'
'5040': '110684'
'5041': '110687'
'5042': '110688'
'5043': '110689'
'5044': '110690'
'5045': '110691'
'5046': '110711'
'5047': '110735'
'5048': '110736'
'5049': '110743'
'5050': '110744'
'5051': '110756'
'5052': '110764'
'5053': '110765'
'5054': '110768'
'5055': '110771'
'5056': '110772'
'5057': '110774'
'5058': '110775'
'5059': '110776'
'5060': '110777'
'5061': '110778'
'5062': '110779'
'5063': '110923'
'5064': '110927'
'5065': '110928'
'5066': '110980'
'5067': '110982'
'5068': '110983'
'5069': '110985'
'5070': '111015'
'5071': '111146'
'5072': '111147'
'5073': '111148'
'5074': '111149'
'5075': '111150'
'5076': '111151'
'5077': '111153'
'5078': '111154'
'5079': '111182'
'5080': '111186'
'5081': '111187'
'5082': '111188'
'5083': '111216'
'5084': '111220'
'5085': '111221'
'5086': '111222'
'5087': '111223'
'5088': '111224'
'5089': '111225'
'5090': '111226'
'5091': '111227'
'5092': '111228'
'5093': '111229'
'5094': '111230'
'5095': '111306'
'5096': '111311'
'5097': '111335'
'5098': '111367'
'5099': '111368'
'5100': '111371'
'5101': '111372'
'5102': '111375'
'5103': '111376'
'5104': '111377'
'5105': '111378'
'5106': '111379'
'5107': '111382'
'5108': '111385'
'5109': '111386'
'5110': '111387'
'5111': '111388'
'5112': '111389'
'5113': '111390'
'5114': '111391'
'5115': '111392'
'5116': '111393'
'5117': '111394'
'5118': '111395'
'5119': '111396'
'5120': '111397'
'5121': '111398'
'5122': '111399'
'5123': '111400'
'5124': '111401'
'5125': '111402'
'5126': '111413'
'5127': '111416'
'5128': '111460'
'5129': '111579'
'5130': '111658'
'5131': '111747'
'5132': '111793'
'5133': '111819'
'5134': '111871'
'5135': '111872'
'5136': '111873'
'5137': '111911'
'5138': '111933'
'5139': '111934'
'5140': '111935'
'5141': '111936'
'5142': '111937'
'5143': '111938'
'5144': '111974'
'5145': '111982'
'5146': '111994'
'5147': '112000'
'5148': '112001'
'5149': '112020'
'5150': '112065'
'5151': '112066'
'5152': '112088'
'5153': '112133'
'5154': '112196'
'5155': '112197'
'5156': '112198'
'5157': '112199'
'5158': '112209'
'5159': '112210'
'5160': '112211'
'5161': '112215'
'5162': '112252'
'5163': '112314'
'5164': '112315'
'5165': '112316'
'5166': '112317'
'5167': '112318'
'5168': '112468'
'5169': '112481'
'5170': '112483'
'5171': '112484'
'5172': '112485'
'5173': '112486'
'5174': '112487'
'5175': '112488'
'5176': '112490'
'5177': '112526'
'5178': '112527'
'5179': '112528'
'5180': '112529'
'5181': '112583'
'5182': '112584'
'5183': '112585'
'5184': '112586'
'5185': '112587'
'5186': '112588'
'5187': '112668'
'5188': '112733'
'5189': '112734'
'5190': '112735'
'5191': '112767'
'5192': '112768'
'5193': '112769'
'5194': '112770'
'5195': '112780'
'5196': '112781'
'5197': '112785'
'5198': '112788'
'5199': '112789'
'5200': '112790'
'5201': '112821'
'5202': '112975'
'5203': '112976'
'5204': '112977'
'5205': '112978'
'5206': '113016'
'5207': '113017'
'5208': '113018'
'5209': '113019'
'5210': '113020'
'5211': '113021'
'5212': '113022'
'5213': '113023'
'5214': '113024'
'5215': '113025'
'5216': '113026'
'5217': '113027'
'5218': '113028'
'5219': '113030'
'5220': '113031'
'5221': '113032'
'5222': '113033'
'5223': '113034'
'5224': '113035'
'5225': '113036'
'5226': '113037'
'5227': '113063'
'5228': '113110'
'5229': '113164'
'5230': '113165'
'5231': '113166'
'5232': '113167'
'5233': '113203'
'5234': '113259'
'5235': '113260'
'5236': '113261'
'5237': '113262'
'5238': '113263'
'5239': '113264'
'5240': '113265'
'5241': '113266'
'5242': '113267'
'5243': '113268'
'5244': '113269'
'5245': '113270'
'5246': '113271'
'5247': '113272'
'5248': '113273'
'5249': '113274'
'5250': '113275'
'5251': '113276'
'5252': '113277'
'5253': '113278'
'5254': '113279'
'5255': '113280'
'5256': '113281'
'5257': '113282'
'5258': '113284'
'5259': '113294'
'5260': '113303'
'5261': '113304'
'5262': '113305'
'5263': '113311'
'5264': '113334'
'5265': '113335'
'5266': '113336'
'5267': '113342'
'5268': '113343'
'5269': '113344'
'5270': '113357'
'5271': '113359'
'5272': '113360'
'5273': '113453'
'5274': '113511'
'5275': '113512'
'5276': '113513'
'5277': '113530'
'5278': '113558'
'5279': '113564'
'5280': '113574'
'5281': '113696'
'5282': '113697'
'5283': '113698'
'5284': '113699'
'5285': '113700'
'5286': '113701'
'5287': '113702'
'5288': '113787'
'5289': '113788'
'5290': '113789'
'5291': '113790'
'5292': '113808'
'5293': '113809'
'5294': '113810'
'5295': '113822'
'5296': '113932'
'5297': '113933'
'5298': '113934'
'5299': '113935'
'5300': '113946'
'5301': '113949'
'5302': '113950'
'5303': '113969'
'5304': '113970'
'5305': '113971'
'5306': '113972'
'5307': '113973'
'5308': '114006'
'5309': '114007'
'5310': '114036'
'5311': '114037'
'5312': '114040'
'5313': '114041'
'5314': '114042'
'5315': '114044'
'5316': '114045'
'5317': '114047'
'5318': '114048'
'5319': '114049'
'5320': '114050'
'5321': '114051'
'5322': '114061'
'5323': '114062'
'5324': '114063'
'5325': '114064'
'5326': '114065'
'5327': '114066'
'5328': '114067'
'5329': '114069'
'5330': '114070'
'5331': '114072'
'5332': '114073'
'5333': '114074'
'5334': '114076'
'5335': '114077'
'5336': '114198'
'5337': '114199'
'5338': '114200'
'5339': '114201'
'5340': '114212'
'5341': '114222'
'5342': '114223'
'5343': '114231'
'5344': '114232'
'5345': '114233'
'5346': '114234'
'5347': '114235'
'5348': '114236'
'5349': '114237'
'5350': '114238'
'5351': '114239'
'5352': '114242'
'5353': '114245'
'5354': '114265'
'5355': '114266'
'5356': '114268'
'5357': '114272'
'5358': '114274'
'5359': '114275'
'5360': '114279'
'5361': '114282'
'5362': '114283'
'5363': '114289'
'5364': '114290'
'5365': '114291'
'5366': '114292'
'5367': '114293'
'5368': '114294'
'5369': '114295'
'5370': '114296'
'5371': '114297'
'5372': '114298'
'5373': '114371'
'5374': '114372'
'5375': '114373'
'5376': '114374'
'5377': '114375'
'5378': '114384'
'5379': '114385'
'5380': '114386'
'5381': '114387'
'5382': '114388'
'5383': '114389'
'5384': '114390'
'5385': '114391'
'5386': '114392'
'5387': '114393'
'5388': '114395'
'5389': '114396'
'5390': '114397'
'5391': '114398'
'5392': '114399'
'5393': '114400'
'5394': '114401'
'5395': '114402'
'5396': '114403'
'5397': '114404'
'5398': '114405'
'5399': '114406'
'5400': '114408'
'5401': '114409'
'5402': '114410'
'5403': '114411'
'5404': '114412'
'5405': '114413'
'5406': '114414'
'5407': '114415'
'5408': '114416'
'5409': '114430'
'5410': '114532'
'5411': '114533'
'5412': '114534'
'5413': '114535'
'5414': '114536'
'5415': '114538'
'5416': '114539'
'5417': '114541'
'5418': '114544'
'5419': '114545'
'5420': '114556'
'5421': '114558'
'5422': '114559'
'5423': '114879'
'5424': '114880'
'5425': '114884'
'5426': '114936'
'5427': '114937'
'5428': '114938'
'5429': '114939'
'5430': '114940'
'5431': '114941'
'5432': '114942'
'5433': '114943'
'5434': '114974'
'5435': '114976'
'5436': '115002'
'5437': '115011'
'5438': '115125'
'5439': '115176'
'5440': '115262'
'5441': '115263'
'5442': '115267'
'5443': '115268'
'5444': '115269'
'5445': '115271'
'5446': '115272'
'5447': '115273'
'5448': '115288'
'5449': '115289'
'5450': '115290'
'5451': '115292'
'5452': '115293'
'5453': '115294'
'5454': '115321'
'5455': '115339'
'5456': '115391'
'5457': '115392'
'5458': '115470'
'5459': '115471'
'5460': '115472'
'5461': '115473'
'5462': '115474'
'5463': '115475'
'5464': '115591'
'5465': '115592'
'5466': '115597'
'5467': '115697'
'5468': '115698'
'5469': '115699'
'5470': '115700'
'5471': '115721'
'5472': '115722'
'5473': '115723'
'5474': '115724'
'5475': '115735'
'5476': '115761'
'5477': '115762'
'5478': '115764'
'5479': '115765'
'5480': '115766'
'5481': '115767'
'5482': '115768'
'5483': '115769'
'5484': '115771'
'5485': '115772'
'5486': '115773'
'5487': '115774'
'5488': '115775'
'5489': '115811'
'5490': '115812'
'5491': '115813'
'5492': '115814'
'5493': '115815'
'5494': '115816'
'5495': '115817'
'5496': '115849'
'5497': '115850'
'5498': '115852'
'5499': '115888'
'5500': '115891'
'5501': '115892'
'5502': '115922'
'5503': '115923'
'5504': '115925'
'5505': '115926'
'5506': '115927'
'5507': '115930'
'5508': '115932'
'5509': '115935'
'5510': '115944'
'5511': '115948'
'5512': '116029'
'5513': '116068'
'5514': '116098'
'5515': '116099'
'5516': '116101'
'5517': '116116'
'5518': '116119'
'5519': '116175'
'5520': '116176'
'5521': '116177'
'5522': '116235'
'5523': '116236'
'5524': '116237'
'5525': '116238'
'5526': '116239'
'5527': '116240'
'5528': '116241'
'5529': '116242'
'5530': '116243'
'5531': '116261'
'5532': '116344'
'5533': '116345'
'5534': '116372'
'5535': '116383'
'5536': '116388'
'5537': '116389'
'5538': '116390'
'5539': '116407'
'5540': '116446'
'5541': '116447'
'5542': '116448'
'5543': '116449'
'5544': '116451'
'5545': '116452'
'5546': '116453'
'5547': '116454'
'5548': '116455'
'5549': '116456'
'5550': '116457'
'5551': '116458'
'5552': '116464'
'5553': '116465'
'5554': '116466'
'5555': '116467'
'5556': '116468'
'5557': '116487'
'5558': '116488'
'5559': '116489'
'5560': '116490'
'5561': '116491'
'5562': '116514'
'5563': '116517'
'5564': '116525'
'5565': '116526'
'5566': '116527'
'5567': '116528'
'5568': '116547'
'5569': '116549'
'5570': '116586'
'5571': '116587'
'5572': '116704'
'5573': '116706'
'5574': '116707'
'5575': '116709'
'5576': '116733'
'5577': '116735'
'5578': '116736'
'5579': '116753'
'5580': '116755'
'5581': '116756'
'5582': '116757'
'5583': '116758'
'5584': '116759'
'5585': '116760'
'5586': '116833'
'5587': '116868'
'5588': '116869'
'5589': '116870'
'5590': '116871'
'5591': '116872'
'5592': '116873'
'5593': '116874'
'5594': '116876'
'5595': '116877'
'5596': '116878'
'5597': '116879'
'5598': '116880'
'5599': '116881'
'5600': '116882'
'5601': '116883'
'5602': '117057'
'5603': '117159'
'5604': '117160'
'5605': '117161'
'5606': '117169'
'5607': '117170'
'5608': '117171'
'5609': '117172'
'5610': '117173'
'5611': '117251'
'5612': '117252'
'5613': '117253'
'5614': '117287'
'5615': '117288'
'5616': '117450'
'5617': '117472'
'5618': '117473'
'5619': '117609'
'5620': '117610'
'5621': '117611'
'5622': '117612'
'5623': '117613'
'5624': '117614'
'5625': '117626'
'5626': '117627'
'5627': '117628'
'5628': '117629'
'5629': '117630'
'5630': '117631'
'5631': '117632'
'5632': '117666'
'5633': '117667'
'5634': '117668'
'5635': '117669'
'5636': '117670'
'5637': '117846'
'5638': '117883'
'5639': '117884'
'5640': '117885'
'5641': '117886'
'5642': '117887'
'5643': '117942'
'5644': '117943'
'5645': '117944'
'5646': '117945'
'5647': '117946'
'5648': '117961'
'5649': '117966'
'5650': '117967'
'5651': '117970'
'5652': '117991'
'5653': '118000'
'5654': '118012'
'5655': '118058'
'5656': '118059'
'5657': '118060'
'5658': '118061'
'5659': '118062'
'5660': '118063'
'5661': '118068'
'5662': '118070'
'5663': '118084'
'5664': '118085'
'5665': '118087'
'5666': '118195'
'5667': '118196'
'5668': '118222'
'5669': '118223'
'5670': '118257'
'5671': '118276'
'5672': '118277'
'5673': '118279'
'5674': '118327'
'5675': '118384'
'5676': '118478'
'5677': '118484'
'5678': '118489'
'5679': '118496'
'5680': '118498'
'5681': '118499'
'5682': '118500'
'5683': '118502'
'5684': '118503'
'5685': '118504'
'5686': '118505'
'5687': '118507'
'5688': '118569'
'5689': '118618'
'5690': '118629'
'5691': '118670'
'5692': '118671'
'5693': '118672'
'5694': '118674'
'5695': '118734'
'5696': '118735'
'5697': '118738'
'5698': '118739'
'5699': '118886'
'5700': '118891'
'5701': '118920'
'5702': '118921'
'5703': '118922'
'5704': '118923'
'5705': '118950'
'5706': '118951'
'5707': '118952'
'5708': '118953'
'5709': '118954'
'5710': '118955'
'5711': '118957'
'5712': '118958'
'5713': '118972'
'5714': '118986'
'5715': '118987'
'5716': '118988'
'5717': '119025'
'5718': '119026'
'5719': '119027'
'5720': '119063'
'5721': '119086'
'5722': '119095'
'5723': '119097'
'5724': '119118'
'5725': '119134'
'5726': '119187'
'5727': '119193'
'5728': '119257'
'5729': '119369'
'5730': '119379'
'5731': '119413'
'5732': '119545'
'5733': '119569'
'5734': '119571'
'5735': '119574'
'5736': '119575'
'5737': '119578'
'5738': '119579'
'5739': '119580'
'5740': '119582'
'5741': '119583'
'5742': '119584'
'5743': '119592'
'5744': '119715'
'5745': '119719'
'5746': '119725'
'5747': '119726'
'5748': '119727'
'5749': '119745'
'5750': '119828'
'5751': '119830'
'5752': '119831'
'5753': '119893'
'5754': '119894'
'5755': '119895'
'5756': '119896'
'5757': '119897'
'5758': '119898'
'5759': '119899'
'5760': '119900'
'5761': '119901'
'5762': '119922'
'5763': '119938'
'5764': '119939'
'5765': '119940'
'5766': '119941'
'5767': '119942'
'5768': '119979'
'5769': '119985'
'5770': '119988'
'5771': '119991'
'5772': '119992'
'5773': '119993'
'5774': '119994'
'5775': '120099'
'5776': '120105'
'5777': '120109'
'5778': '120111'
'5779': '120112'
'5780': '120150'
'5781': '120160'
'5782': '120161'
'5783': '120171'
'5784': '120172'
'5785': '120177'
'5786': '120178'
'5787': '120179'
'5788': '120183'
'5789': '120184'
'5790': '120188'
'5791': '120189'
'5792': '120194'
'5793': '120196'
'5794': '120199'
'5795': '120200'
'5796': '120201'
'5797': '120203'
'5798': '120206'
'5799': '120207'
'5800': '120208'
'5801': '120296'
'5802': '120297'
'5803': '120298'
'5804': '120299'
'5805': '120300'
'5806': '120302'
'5807': '120303'
'5808': '120304'
'5809': '120305'
'5810': '120306'
'5811': '120307'
'5812': '120308'
'5813': '120309'
'5814': '120310'
'5815': '120312'
'5816': '120313'
'5817': '120314'
'5818': '120315'
'5819': '120316'
'5820': '120317'
'5821': '120318'
'5822': '120319'
'5823': '120320'
'5824': '120321'
'5825': '120322'
'5826': '120323'
'5827': '120324'
'5828': '120325'
'5829': '120326'
'5830': '120327'
'5831': '120328'
'5832': '120329'
'5833': '120330'
'5834': '120331'
'5835': '120332'
'5836': '120333'
'5837': '120462'
'5838': '120466'
'5839': '120467'
'5840': '120468'
'5841': '120469'
'5842': '120470'
'5843': '120471'
'5844': '120504'
'5845': '120513'
'5846': '120514'
'5847': '120515'
'5848': '120518'
'5849': '120769'
'5850': '120770'
'5851': '120771'
'5852': '120772'
'5853': '120773'
'5854': '120774'
'5855': '120775'
'5856': '120776'
'5857': '120777'
'5858': '120778'
'5859': '120779'
'5860': '120782'
'5861': '121251'
'5862': '121256'
'5863': '121257'
'5864': '121273'
'5865': '121288'
'5866': '121312'
'5867': '121313'
'5868': '121314'
'5869': '121315'
'5870': '121316'
'5871': '121317'
'5872': '121318'
'5873': '121319'
'5874': '121320'
'5875': '121321'
'5876': '121322'
'5877': '121323'
'5878': '121346'
'5879': '121366'
'5880': '121415'
'5881': '121449'
'5882': '121450'
'5883': '121451'
'5884': '121452'
'5885': '121453'
'5886': '121454'
'5887': '121472'
'5888': '121473'
'5889': '121474'
'5890': '121475'
'5891': '121570'
'5892': '121589'
'5893': '121590'
'5894': '121591'
'5895': '121592'
'5896': '121593'
'5897': '121594'
'5898': '121595'
'5899': '121651'
'5900': '121652'
'5901': '121653'
'5902': '121654'
'5903': '121655'
'5904': '121656'
'5905': '121657'
'5906': '121658'
'5907': '121659'
'5908': '121660'
'5909': '121661'
'5910': '121662'
'5911': '121663'
'5912': '121664'
'5913': '121665'
'5914': '121666'
'5915': '121734'
'5916': '121735'
'5917': '121736'
'5918': '121737'
'5919': '121738'
'5920': '121739'
'5921': '121740'
'5922': '121813'
'5923': '121866'
'5924': '121867'
'5925': '121869'
'5926': '121913'
'5927': '121915'
'5928': '121922'
'5929': '121926'
'5930': '121929'
'5931': '121930'
'5932': '121976'
'5933': '121985'
'5934': '121987'
'5935': '121998'
'5936': '122001'
'5937': '122003'
'5938': '122004'
'5939': '122066'
'5940': '122077'
'5941': '122079'
'5942': '122080'
'5943': '122081'
'5944': '122082'
'5945': '122083'
'5946': '122084'
'5947': '122085'
'5948': '122086'
'5949': '122087'
'5950': '122088'
'5951': '122106'
'5952': '122107'
'5953': '122132'
'5954': '122143'
'5955': '122153'
'5956': '122155'
'5957': '122166'
'5958': '122168'
'5959': '122190'
'5960': '122199'
'5961': '122201'
'5962': '122204'
'5963': '122247'
'5964': '122261'
'5965': '122352'
'5966': '122353'
'5967': '122354'
'5968': '122355'
'5969': '122356'
'5970': '122357'
'5971': '122358'
'5972': '122359'
'5973': '122360'
'5974': '122362'
'5975': '122363'
'5976': '122364'
'5977': '122365'
'5978': '122395'
'5979': '122397'
'5980': '122398'
'5981': '122399'
'5982': '122400'
'5983': '122456'
'5984': '122457'
'5985': '122472'
'5986': '122473'
'5987': '122474'
'5988': '122475'
'5989': '122498'
'5990': '122499'
'5991': '122500'
'5992': '122503'
'5993': '122504'
'5994': '122510'
'5995': '122511'
'5996': '122533'
'5997': '122534'
'5998': '122578'
'5999': '122579'
'6000': '122620'
'6001': '122621'
'6002': '122622'
'6003': '122623'
'6004': '122624'
'6005': '122625'
'6006': '122626'
'6007': '122627'
'6008': '122628'
'6009': '122630'
'6010': '122631'
'6011': '122632'
'6012': '122633'
'6013': '122634'
'6014': '122635'
'6015': '122644'
'6016': '122645'
'6017': '122646'
'6018': '122647'
'6019': '122648'
'6020': '122649'
'6021': '122650'
'6022': '122651'
'6023': '122654'
'6024': '122671'
'6025': '122673'
'6026': '122675'
'6027': '122683'
'6028': '122685'
'6029': '122686'
'6030': '122798'
'6031': '122799'
'6032': '122800'
'6033': '122803'
'6034': '122804'
'6035': '122805'
'6036': '122806'
'6037': '122807'
'6038': '122808'
'6039': '122809'
'6040': '122810'
'6041': '122832'
'6042': '122901'
'6043': '122910'
'6044': '122911'
'6045': '122932'
'6046': '122934'
'6047': '122935'
'6048': '122936'
'6049': '122959'
'6050': '122999'
'6051': '123000'
'6052': '123001'
'6053': '123002'
'6054': '123003'
'6055': '123004'
'6056': '123094'
'6057': '123096'
'6058': '123097'
'6059': '123099'
'6060': '123147'
'6061': '123273'
'6062': '123278'
'6063': '123333'
'6064': '123342'
'6065': '123427'
'6066': '123438'
'6067': '123439'
'6068': '123440'
'6069': '123441'
'6070': '123442'
'6071': '123458'
'6072': '123461'
'6073': '123467'
'6074': '123468'
'6075': '123474'
'6076': '123484'
'6077': '123485'
'6078': '123486'
'6079': '123487'
'6080': '123488'
'6081': '123490'
'6082': '123494'
'6083': '123501'
'6084': '123502'
'6085': '123503'
'6086': '123504'
'6087': '123505'
'6088': '123506'
'6089': '123509'
'6090': '123523'
'6091': '123614'
'6092': '123641'
'6093': '123645'
'6094': '123647'
'6095': '123760'
'6096': '123761'
'6097': '123762'
'6098': '123763'
'6099': '123764'
'6100': '123821'
'6101': '123825'
'6102': '123832'
'6103': '123834'
'6104': '123835'
'6105': '123866'
'6106': '123867'
'6107': '123868'
'6108': '123899'
'6109': '123932'
'6110': '123933'
'6111': '123934'
'6112': '123935'
'6113': '123936'
'6114': '123937'
'6115': '123938'
'6116': '123964'
'6117': '123965'
'6118': '123966'
'6119': '123968'
'6120': '123969'
'6121': '123970'
'6122': '123971'
'6123': '123972'
'6124': '123973'
'6125': '123974'
'6126': '123975'
'6127': '123976'
'6128': '123977'
'6129': '123978'
'6130': '123979'
'6131': '123980'
'6132': '123981'
'6133': '123986'
'6134': '124154'
'6135': '124175'
'6136': '124176'
'6137': '124177'
'6138': '124178'
'6139': '124179'
'6140': '124180'
'6141': '124181'
'6142': '124183'
'6143': '124184'
'6144': '124185'
'6145': '124186'
'6146': '124201'
'6147': '124231'
'6148': '124391'
'6149': '124392'
'6150': '124393'
'6151': '124394'
'6152': '124409'
'6153': '124411'
'6154': '124424'
'6155': '124425'
'6156': '124426'
'6157': '124460'
'6158': '124461'
'6159': '124470'
'6160': '124474'
'6161': '124477'
'6162': '124479'
'6163': '124480'
'6164': '124481'
'6165': '124482'
'6166': '124483'
'6167': '124484'
'6168': '124485'
'6169': '124509'
'6170': '124517'
'6171': '124518'
'6172': '124519'
'6173': '124554'
'6174': '124555'
'6175': '124702'
'6176': '124752'
'6177': '124753'
'6178': '124754'
'6179': '124755'
'6180': '124756'
'6181': '124870'
'6182': '124872'
'6183': '124873'
'6184': '124874'
'6185': '124875'
'6186': '124876'
'6187': '124877'
'6188': '124891'
'6189': '124892'
'6190': '124912'
'6191': '124913'
'6192': '124915'
'6193': '124916'
'6194': '124917'
'6195': '124918'
'6196': '124971'
'6197': '124992'
'6198': '124996'
'6199': '125001'
'6200': '125002'
'6201': '125003'
'6202': '125004'
'6203': '125154'
'6204': '125156'
'6205': '125157'
'6206': '125158'
'6207': '125159'
'6208': '125160'
'6209': '125161'
'6210': '125182'
'6211': '125183'
'6212': '125185'
'6213': '125186'
'6214': '125187'
'6215': '125188'
'6216': '125189'
'6217': '125190'
'6218': '125191'
'6219': '125192'
'6220': '125193'
'6221': '125194'
'6222': '125195'
'6223': '125196'
'6224': '125237'
'6225': '125238'
'6226': '125239'
'6227': '125240'
'6228': '125286'
'6229': '125287'
'6230': '125288'
'6231': '125289'
'6232': '125291'
'6233': '125293'
'6234': '125298'
'6235': '125299'
'6236': '125312'
'6237': '125313'
'6238': '125314'
'6239': '125315'
'6240': '125333'
'6241': '125337'
'6242': '125375'
'6243': '125377'
'6244': '125432'
'6245': '125551'
'6246': '125612'
'6247': '125614'
'6248': '125616'
'6249': '125617'
'6250': '125618'
'6251': '125620'
'6252': '125621'
'6253': '125622'
'6254': '125657'
'6255': '125659'
'6256': '125680'
'6257': '125681'
'6258': '125721'
'6259': '125722'
'6260': '125723'
'6261': '125774'
'6262': '125776'
'6263': '125777'
'6264': '125778'
'6265': '125779'
'6266': '125809'
'6267': '125812'
'6268': '125813'
'6269': '125814'
'6270': '125815'
'6271': '125816'
'6272': '125817'
'6273': '125818'
'6274': '125819'
'6275': '125820'
'6276': '125821'
'6277': '125822'
'6278': '125823'
'6279': '125824'
'6280': '125825'
'6281': '125826'
'6282': '125827'
'6283': '125999'
'6284': '126014'
'6285': '126015'
'6286': '126016'
'6287': '126017'
'6288': '126018'
'6289': '126047'
'6290': '126055'
'6291': '126102'
'6292': '126103'
'6293': '126104'
'6294': '126105'
'6295': '126180'
'6296': '126181'
'6297': '126182'
'6298': '126183'
'6299': '126185'
'6300': '126186'
'6301': '126187'
'6302': '126188'
'6303': '126189'
'6304': '126214'
'6305': '126215'
'6306': '126216'
'6307': '126217'
'6308': '126218'
'6309': '126219'
'6310': '126220'
'6311': '126221'
'6312': '126223'
'6313': '126224'
'6314': '126225'
'6315': '126226'
'6316': '126227'
'6317': '126229'
'6318': '126230'
'6319': '126231'
'6320': '126232'
'6321': '126233'
'6322': '126234'
'6323': '126240'
'6324': '126241'
'6325': '126242'
'6326': '126243'
'6327': '126276'
'6328': '126283'
'6329': '126289'
'6330': '126290'
'6331': '126291'
'6332': '126292'
'6333': '126294'
'6334': '126295'
'6335': '126297'
'6336': '126300'
'6337': '126316'
'6338': '126317'
'6339': '126318'
'6340': '126319'
'6341': '126320'
'6342': '126321'
'6343': '126354'
'6344': '126357'
'6345': '126362'
'6346': '126398'
'6347': '126400'
'6348': '126401'
'6349': '126402'
'6350': '126403'
'6351': '126404'
'6352': '126405'
'6353': '126406'
'6354': '126407'
'6355': '126408'
'6356': '126409'
'6357': '126410'
'6358': '126411'
'6359': '126412'
'6360': '126413'
'6361': '126414'
'6362': '126415'
'6363': '126416'
'6364': '126417'
'6365': '126425'
'6366': '126426'
'6367': '126427'
'6368': '126428'
'6369': '126429'
'6370': '126430'
'6371': '126431'
'6372': '126455'
'6373': '126489'
'6374': '126490'
'6375': '126491'
'6376': '126505'
'6377': '126506'
'6378': '126507'
'6379': '126508'
'6380': '126510'
'6381': '126512'
'6382': '126516'
'6383': '126519'
'6384': '126520'
'6385': '126521'
'6386': '126522'
'6387': '126550'
'6388': '126557'
'6389': '126559'
'6390': '126584'
'6391': '126585'
'6392': '126586'
'6393': '126587'
'6394': '126588'
'6395': '126589'
'6396': '126598'
'6397': '126600'
'6398': '126601'
'6399': '126602'
'6400': '126603'
'6401': '126605'
'6402': '126606'
'6403': '126607'
'6404': '126608'
'6405': '126646'
'6406': '126666'
'6407': '126667'
'6408': '126668'
'6409': '126669'
'6410': '126670'
'6411': '126671'
'6412': '126672'
'6413': '126673'
'6414': '126674'
'6415': '126675'
'6416': '126676'
'6417': '126716'
'6418': '126717'
'6419': '126718'
'6420': '126719'
'6421': '126720'
'6422': '126743'
'6423': '126746'
'6424': '126747'
'6425': '126748'
'6426': '126749'
'6427': '126773'
'6428': '126778'
'6429': '126781'
'6430': '126782'
'6431': '126786'
'6432': '126789'
'6433': '126790'
'6434': '126882'
'6435': '126883'
'6436': '126884'
'6437': '126885'
'6438': '126886'
'6439': '126887'
'6440': '126899'
'6441': '126900'
'6442': '126944'
'6443': '126979'
'6444': '127036'
'6445': '127037'
'6446': '127062'
'6447': '127066'
'6448': '127155'
'6449': '127159'
'6450': '127180'
'6451': '127181'
'6452': '127182'
'6453': '127183'
'6454': '127184'
'6455': '127185'
'6456': '127186'
'6457': '127187'
'6458': '127188'
'6459': '127189'
'6460': '127190'
'6461': '127191'
'6462': '127192'
'6463': '127193'
'6464': '127194'
'6465': '127203'
'6466': '127204'
'6467': '127205'
'6468': '127206'
'6469': '127207'
'6470': '127208'
'6471': '127209'
'6472': '127210'
'6473': '127211'
'6474': '127212'
'6475': '127263'
'6476': '127265'
'6477': '127266'
'6478': '127267'
'6479': '127268'
'6480': '127269'
'6481': '127271'
'6482': '127273'
'6483': '127274'
'6484': '127275'
'6485': '127276'
'6486': '127277'
'6487': '127278'
'6488': '127279'
'6489': '127280'
'6490': '127281'
'6491': '127285'
'6492': '127286'
'6493': '127287'
'6494': '127288'
'6495': '127289'
'6496': '127290'
'6497': '127294'
'6498': '127295'
'6499': '127296'
'6500': '127297'
'6501': '127298'
'6502': '127299'
'6503': '127300'
'6504': '127301'
'6505': '127302'
'6506': '127303'
'6507': '127330'
'6508': '127331'
'6509': '127339'
'6510': '127343'
'6511': '127349'
'6512': '127350'
'6513': '127356'
'6514': '127357'
'6515': '127358'
'6516': '127359'
'6517': '127360'
'6518': '127402'
'6519': '127422'
'6520': '127469'
'6521': '127484'
'6522': '127494'
'6523': '127495'
'6524': '127496'
'6525': '127497'
'6526': '127498'
'6527': '127499'
'6528': '127519'
'6529': '127520'
'6530': '127532'
'6531': '127541'
'6532': '127542'
'6533': '127559'
'6534': '127620'
'6535': '127623'
'6536': '127648'
'6537': '127660'
'6538': '127661'
'6539': '127662'
'6540': '127663'
'6541': '127720'
'6542': '127722'
'6543': '127726'
'6544': '127798'
'6545': '127804'
'6546': '127806'
'6547': '127865'
'6548': '127866'
'6549': '127867'
'6550': '127868'
'6551': '127869'
'6552': '127870'
'6553': '127871'
'6554': '127878'
'6555': '127908'
'6556': '127909'
'6557': '127910'
'6558': '127911'
'6559': '127912'
'6560': '127913'
'6561': '127914'
'6562': '127915'
'6563': '127916'
'6564': '127936'
'6565': '127996'
'6566': '128441'
'6567': '128443'
'6568': '128448'
'6569': '128469'
'6570': '128470'
'6571': '128471'
'6572': '128472'
'6573': '128473'
'6574': '128476'
'6575': '128477'
'6576': '128482'
'6577': '128484'
'6578': '128494'
'6579': '128500'
'6580': '128504'
'6581': '128619'
'6582': '128666'
'6583': '128668'
'6584': '128699'
'6585': '128709'
'6586': '128710'
'6587': '128711'
'6588': '128758'
'6589': '128759'
'6590': '128760'
'6591': '128799'
'6592': '128811'
'6593': '128812'
'6594': '128813'
'6595': '128814'
'6596': '128815'
'6597': '128816'
'6598': '128825'
'6599': '128827'
'6600': '128828'
'6601': '128835'
'6602': '128845'
'6603': '128878'
'6604': '128879'
'6605': '128880'
'6606': '128881'
'6607': '128882'
'6608': '128885'
'6609': '128886'
'6610': '128887'
'6611': '128888'
'6612': '128927'
'6613': '128992'
'6614': '129039'
'6615': '129040'
'6616': '129042'
'6617': '129043'
'6618': '129044'
'6619': '129046'
'6620': '129048'
'6621': '129049'
'6622': '129051'
'6623': '129052'
'6624': '129053'
'6625': '129054'
'6626': '129055'
'6627': '129056'
'6628': '129088'
'6629': '129089'
'6630': '129090'
'6631': '129091'
'6632': '129092'
'6633': '129093'
'6634': '129094'
'6635': '129095'
'6636': '129096'
'6637': '129097'
'6638': '129098'
'6639': '129184'
'6640': '129185'
'6641': '129186'
'6642': '129187'
'6643': '129188'
'6644': '129189'
'6645': '129190'
'6646': '129268'
'6647': '129362'
'6648': '129372'
'6649': '129374'
'6650': '129375'
'6651': '129391'
'6652': '129392'
'6653': '129393'
'6654': '129395'
'6655': '129396'
'6656': '129397'
'6657': '129398'
'6658': '129399'
'6659': '129400'
'6660': '129401'
'6661': '129402'
'6662': '129403'
'6663': '129404'
'6664': '129405'
'6665': '129406'
'6666': '129407'
'6667': '129439'
'6668': '129442'
'6669': '129444'
'6670': '129620'
'6671': '129622'
'6672': '129624'
'6673': '129674'
'6674': '129675'
'6675': '129683'
'6676': '129694'
'6677': '129695'
'6678': '129696'
'6679': '129742'
'6680': '129806'
'6681': '129807'
'6682': '129808'
'6683': '129816'
'6684': '129874'
'6685': '129875'
'6686': '129876'
'6687': '129879'
'6688': '129880'
'6689': '129882'
'6690': '129883'
'6691': '129884'
'6692': '129885'
'6693': '129886'
'6694': '129887'
'6695': '129889'
'6696': '129904'
'6697': '129910'
'6698': '129914'
'6699': '129915'
'6700': '129918'
'6701': '129919'
'6702': '129920'
'6703': '129922'
'6704': '129923'
'6705': '129924'
'6706': '129925'
'6707': '129926'
'6708': '129927'
'6709': '129962'
'6710': '129968'
'6711': '129969'
'6712': '129970'
'6713': '129972'
'6714': '129973'
'6715': '129997'
'6716': '130016'
'6717': '130084'
'6718': '130129'
'6719': '130130'
'6720': '130131'
'6721': '130132'
'6722': '130133'
'6723': '130134'
'6724': '130135'
'6725': '130136'
'6726': '130137'
'6727': '130168'
'6728': '130170'
'6729': '130218'
'6730': '130265'
'6731': '130347'
'6732': '130349'
'6733': '130367'
'6734': '130368'
'6735': '130369'
'6736': '130370'
'6737': '130371'
'6738': '130372'
'6739': '130440'
'6740': '130454'
'6741': '130456'
'6742': '130650'
'6743': '130667'
'6744': '130682'
'6745': '130683'
'6746': '130689'
'6747': '130691'
'6748': '130692'
'6749': '130693'
'6750': '130702'
'6751': '130709'
'6752': '130710'
'6753': '130711'
'6754': '130752'
'6755': '130758'
'6756': '130920'
'6757': '130921'
'6758': '130922'
'6759': '130923'
'6760': '130927'
'6761': '130929'
'6762': '130930'
'6763': '130931'
'6764': '130932'
'6765': '130933'
'6766': '130934'
'6767': '130937'
'6768': '130940'
'6769': '130944'
'6770': '130945'
'6771': '130948'
'6772': '130950'
'6773': '130951'
'6774': '130952'
'6775': '130953'
'6776': '130954'
'6777': '130955'
'6778': '130956'
'6779': '130963'
'6780': '130964'
'6781': '130986'
'6782': '130988'
'6783': '130989'
'6784': '130990'
'6785': '130991'
'6786': '130992'
'6787': '130993'
'6788': '131016'
'6789': '131019'
'6790': '131020'
'6791': '131021'
'6792': '131024'
'6793': '131166'
'6794': '131292'
'6795': '131323'
'6796': '131324'
'6797': '131325'
'6798': '131326'
'6799': '131327'
'6800': '131385'
'6801': '131410'
'6802': '131422'
'6803': '131425'
'6804': '131426'
'6805': '131436'
'6806': '131439'
'6807': '131444'
'6808': '131446'
'6809': '131448'
'6810': '131449'
'6811': '131451'
'6812': '131452'
'6813': '131453'
'6814': '131454'
'6815': '131476'
'6816': '131536'
'6817': '131540'
'6818': '131552'
'6819': '131553'
'6820': '131554'
'6821': '131567'
'6822': '131624'
'6823': '131656'
'6824': '131657'
'6825': '131658'
'6826': '131764'
'6827': '131767'
'6828': '131770'
'6829': '131771'
'6830': '131772'
'6831': '131773'
'6832': '131774'
'6833': '131787'
'6834': '131789'
'6835': '131791'
'6836': '131792'
'6837': '131794'
'6838': '131795'
'6839': '131796'
'6840': '131797'
'6841': '131837'
'6842': '131897'
'6843': '131899'
'6844': '131900'
'6845': '131901'
'6846': '131902'
'6847': '131903'
'6848': '131904'
'6849': '131911'
'6850': '131912'
'6851': '131913'
'6852': '131914'
'6853': '131917'
'6854': '131918'
'6855': '131919'
'6856': '131922'
'6857': '131923'
'6858': '131924'
'6859': '131925'
'6860': '131932'
'6861': '131933'
'6862': '131934'
'6863': '131935'
'6864': '131936'
'6865': '131938'
'6866': '131939'
'6867': '131940'
'6868': '131941'
'6869': '131942'
'6870': '131950'
'6871': '131951'
'6872': '131952'
'6873': '131953'
'6874': '131978'
'6875': '131979'
'6876': '131980'
'6877': '131982'
'6878': '131983'
'6879': '131984'
'6880': '131985'
'6881': '131986'
'6882': '132019'
'6883': '132040'
'6884': '132041'
'6885': '132042'
'6886': '132045'
'6887': '132117'
'6888': '132118'
'6889': '132122'
'6890': '132134'
'6891': '132138'
'6892': '132139'
'6893': '132140'
'6894': '132141'
'6895': '132142'
'6896': '132171'
'6897': '132272'
'6898': '132310'
'6899': '132420'
'6900': '132424'
'6901': '132434'
'6902': '132436'
'6903': '132448'
'6904': '132449'
'6905': '132453'
'6906': '132454'
'6907': '132455'
'6908': '132456'
'6909': '132561'
'6910': '132566'
'6911': '132567'
'6912': '132568'
'6913': '132589'
'6914': '132675'
'6915': '132677'
'6916': '132678'
'6917': '132679'
'6918': '132773'
'6919': '132774'
'6920': '132775'
'6921': '132778'
'6922': '132779'
'6923': '132781'
'6924': '132784'
'6925': '132786'
'6926': '132787'
'6927': '132788'
'6928': '132789'
'6929': '132790'
'6930': '132791'
'6931': '132792'
'6932': '132793'
'6933': '132794'
'6934': '132795'
'6935': '132914'
'6936': '132954'
'6937': '132961'
'6938': '132962'
'6939': '132963'
'6940': '132964'
'6941': '132965'
'6942': '133015'
'6943': '133016'
'6944': '133019'
'6945': '133020'
'6946': '133022'
'6947': '133023'
'6948': '133024'
'6949': '133025'
'6950': '133026'
'6951': '133027'
'6952': '133028'
'6953': '133029'
'6954': '133100'
'6955': '133102'
'6956': '133272'
'6957': '133273'
'6958': '133274'
'6959': '133275'
'6960': '133276'
'6961': '133293'
'6962': '133294'
'6963': '133332'
'6964': '133333'
'6965': '133431'
'6966': '133432'
'6967': '133433'
'6968': '133434'
'6969': '133435'
'6970': '133436'
'6971': '133437'
'6972': '133438'
'6973': '133439'
'6974': '133440'
'6975': '133441'
'6976': '133442'
'6977': '133443'
'6978': '133444'
'6979': '133445'
'6980': '133446'
'6981': '133447'
'6982': '133448'
'6983': '133449'
'6984': '133450'
'6985': '133451'
'6986': '133452'
'6987': '133453'
'6988': '133454'
'6989': '133455'
'6990': '133456'
'6991': '133457'
'6992': '133459'
'6993': '133479'
'6994': '133535'
'6995': '133537'
'6996': '133538'
'6997': '133544'
'6998': '133545'
'6999': '133546'
'7000': '133551'
'7001': '133553'
'7002': '133560'
'7003': '133561'
'7004': '133562'
'7005': '133563'
'7006': '133564'
'7007': '133567'
'7008': '133571'
'7009': '133572'
'7010': '133573'
'7011': '133574'
'7012': '133576'
'7013': '133579'
'7014': '133580'
'7015': '133632'
'7016': '133638'
'7017': '133639'
'7018': '133681'
'7019': '133729'
'7020': '133731'
'7021': '133770'
'7022': '133772'
'7023': '133780'
'7024': '133781'
'7025': '133788'
'7026': '133793'
'7027': '133798'
'7028': '133802'
'7029': '133803'
'7030': '133833'
'7031': '133835'
'7032': '133836'
'7033': '133837'
'7034': '133838'
'7035': '133916'
'7036': '133942'
'7037': '133943'
'7038': '133967'
'7039': '133968'
'7040': '133969'
'7041': '133970'
'7042': '133971'
'7043': '133972'
'7044': '133973'
'7045': '133974'
'7046': '133975'
'7047': '133976'
'7048': '133977'
'7049': '133978'
'7050': '134034'
'7051': '134052'
'7052': '134053'
'7053': '134054'
'7054': '134073'
'7055': '134077'
'7056': '134084'
'7057': '134094'
'7058': '134359'
'7059': '134384'
'7060': '134385'
'7061': '134388'
'7062': '134389'
'7063': '134443'
'7064': '134444'
'7065': '134445'
'7066': '134446'
'7067': '134447'
'7068': '134448'
'7069': '134449'
'7070': '134452'
'7071': '134453'
'7072': '134454'
'7073': '134455'
'7074': '134486'
'7075': '134509'
'7076': '134510'
'7077': '134580'
'7078': '134586'
'7079': '134594'
'7080': '134610'
'7081': '134631'
'7082': '134643'
'7083': '134790'
'7084': '134791'
'7085': '134792'
'7086': '134793'
'7087': '134794'
'7088': '134795'
'7089': '134796'
'7090': '134797'
'7091': '134801'
'7092': '134823'
'7093': '134824'
'7094': '134825'
'7095': '134826'
'7096': '134827'
'7097': '134918'
'7098': '134919'
'7099': '134922'
'7100': '134923'
'7101': '134928'
'7102': '134929'
'7103': '134930'
'7104': '134931'
'7105': '134932'
'7106': '134933'
'7107': '134934'
'7108': '134935'
'7109': '134936'
'7110': '134937'
'7111': '134938'
'7112': '134939'
'7113': '134940'
'7114': '134941'
'7115': '134942'
'7116': '134943'
'7117': '134947'
'7118': '134948'
'7119': '134949'
'7120': '134950'
'7121': '134951'
'7122': '134952'
'7123': '134956'
'7124': '134959'
'7125': '134962'
'7126': '134979'
'7127': '134981'
'7128': '135010'
'7129': '135028'
'7130': '135039'
'7131': '135043'
'7132': '135044'
'7133': '135054'
'7134': '135089'
'7135': '135091'
'7136': '135092'
'7137': '135219'
'7138': '135220'
'7139': '135221'
'7140': '135222'
'7141': '135223'
'7142': '135224'
'7143': '135225'
'7144': '135226'
'7145': '135227'
'7146': '135228'
'7147': '135229'
'7148': '135336'
'7149': '135337'
'7150': '135338'
'7151': '135339'
'7152': '135340'
'7153': '135341'
'7154': '135342'
'7155': '135363'
'7156': '135364'
'7157': '135365'
'7158': '135368'
'7159': '135369'
'7160': '135370'
'7161': '135371'
'7162': '135372'
'7163': '135373'
'7164': '135374'
'7165': '135375'
'7166': '135986'
'7167': '135989'
'7168': '135990'
'7169': '136054'
'7170': '136091'
'7171': '136094'
'7172': '136134'
'7173': '136137'
'7174': '136138'
'7175': '136275'
'7176': '136276'
'7177': '136320'
'7178': '136321'
'7179': '136322'
'7180': '136323'
'7181': '136324'
'7182': '136331'
'7183': '136404'
'7184': '136424'
'7185': '136449'
'7186': '136465'
'7187': '136466'
'7188': '136467'
'7189': '136468'
'7190': '136469'
'7191': '136705'
'7192': '136706'
'7193': '136707'
'7194': '136708'
'7195': '136709'
'7196': '136928'
'7197': '136994'
'7198': '136995'
'7199': '137054'
'7200': '137151'
'7201': '137152'
'7202': '137166'
'7203': '137167'
'7204': '137168'
'7205': '137169'
'7206': '137170'
'7207': '137171'
'7208': '137172'
'7209': '137173'
'7210': '137174'
'7211': '137175'
'7212': '137176'
'7213': '137211'
'7214': '137212'
'7215': '137213'
'7216': '137214'
'7217': '137356'
'7218': '137417'
'7219': '137418'
'7220': '137419'
'7221': '137423'
'7222': '137424'
'7223': '137425'
'7224': '137426'
'7225': '137462'
'7226': '137463'
'7227': '137484'
'7228': '137500'
'7229': '137551'
'7230': '137561'
'7231': '137563'
'7232': '137567'
'7233': '137593'
'7234': '137605'
'7235': '137624'
'7236': '137627'
'7237': '137630'
'7238': '137631'
'7239': '137632'
'7240': '137715'
'7241': '137716'
'7242': '137717'
'7243': '137719'
'7244': '137720'
'7245': '137721'
'7246': '137722'
'7247': '137723'
'7248': '137724'
'7249': '137725'
'7250': '137740'
'7251': '137895'
'7252': '137896'
'7253': '137898'
'7254': '137899'
'7255': '137900'
'7256': '137901'
'7257': '137907'
'7258': '137935'
'7259': '137990'
'7260': '137998'
'7261': '138010'
'7262': '138015'
'7263': '138016'
'7264': '138017'
'7265': '138018'
'7266': '138019'
'7267': '138020'
'7268': '138021'
'7269': '138022'
'7270': '138023'
'7271': '138024'
'7272': '138025'
'7273': '138026'
'7274': '138038'
'7275': '138039'
'7276': '138040'
'7277': '138041'
'7278': '138053'
'7279': '138060'
'7280': '138061'
'7281': '138062'
'7282': '138063'
'7283': '138064'
'7284': '138065'
'7285': '138066'
'7286': '138067'
'7287': '138068'
'7288': '138069'
'7289': '138070'
'7290': '138071'
'7291': '138207'
'7292': '138210'
'7293': '138211'
'7294': '138212'
'7295': '138213'
'7296': '138215'
'7297': '138216'
'7298': '138217'
'7299': '138218'
'7300': '138256'
'7301': '138282'
'7302': '138306'
'7303': '138311'
'7304': '138317'
'7305': '138318'
'7306': '138319'
'7307': '138320'
'7308': '138351'
'7309': '138355'
'7310': '138406'
'7311': '138410'
'7312': '138413'
'7313': '138414'
'7314': '138415'
'7315': '138416'
'7316': '138578'
'7317': '138579'
'7318': '138580'
'7319': '138581'
'7320': '139003'
'7321': '139043'
'7322': '139110'
'7323': '139112'
'7324': '139117'
'7325': '139123'
'7326': '139226'
'7327': '139329'
'7328': '139330'
'7329': '139461'
'7330': '139485'
'7331': '139491'
'7332': '139520'
'7333': '139521'
'7334': '139522'
'7335': '139523'
'7336': '139524'
'7337': '139532'
'7338': '139534'
'7339': '139536'
'7340': '139537'
'7341': '139637'
'7342': '139638'
'7343': '139663'
'7344': '139681'
'7345': '139687'
'7346': '139688'
'7347': '139769'
'7348': '139770'
'7349': '139771'
'7350': '139772'
'7351': '139773'
'7352': '139774'
'7353': '139775'
'7354': '139776'
'7355': '139777'
'7356': '139804'
'7357': '139862'
'7358': '139876'
'7359': '139933'
'7360': '139934'
'7361': '139935'
'7362': '139936'
'7363': '139937'
'7364': '139954'
'7365': '139990'
'7366': '139991'
'7367': '139992'
'7368': '139993'
'7369': '139994'
'7370': '139995'
'7371': '140043'
'7372': '140134'
'7373': '140135'
'7374': '140258'
'7375': '140259'
'7376': '140260'
'7377': '140261'
'7378': '140262'
'7379': '140263'
'7380': '140266'
'7381': '140316'
'7382': '140344'
'7383': '140421'
'7384': '140564'
'7385': '140565'
'7386': '140566'
'7387': '140576'
'7388': '140583'
'7389': '140584'
'7390': '140609'
'7391': '140620'
'7392': '140621'
'7393': '140623'
'7394': '140625'
'7395': '140626'
'7396': '140788'
'7397': '140789'
'7398': '140790'
'7399': '140791'
'7400': '140794'
'7401': '140871'
'7402': '140872'
'7403': '140873'
'7404': '140874'
'7405': '140875'
'7406': '140922'
'7407': '140923'
'7408': '140924'
'7409': '140925'
'7410': '140926'
'7411': '140933'
'7412': '140934'
'7413': '140935'
'7414': '140939'
'7415': '141074'
'7416': '141137'
'7417': '141139'
'7418': '141141'
'7419': '141143'
'7420': '141144'
'7421': '141164'
'7422': '141166'
'7423': '141167'
'7424': '141168'
'7425': '141173'
'7426': '141179'
'7427': '141180'
'7428': '141181'
'7429': '141182'
'7430': '141264'
'7431': '141282'
'7432': '141283'
'7433': '141284'
'7434': '141285'
'7435': '141286'
'7436': '141287'
'7437': '141288'
'7438': '141289'
'7439': '141290'
'7440': '141291'
'7441': '141292'
'7442': '141293'
'7443': '141295'
'7444': '141296'
'7445': '141297'
'7446': '141299'
'7447': '141300'
'7448': '141303'
'7449': '141304'
'7450': '141310'
'7451': '141375'
'7452': '141561'
'7453': '141562'
'7454': '141564'
'7455': '141566'
'7456': '141567'
'7457': '141568'
'7458': '141569'
'7459': '141590'
'7460': '141591'
'7461': '141592'
'7462': '141593'
'7463': '141594'
'7464': '141616'
'7465': '141617'
'7466': '141618'
'7467': '141619'
'7468': '141735'
'7469': '141873'
'7470': '141874'
'7471': '141875'
'7472': '141876'
'7473': '141877'
'7474': '141878'
'7475': '141894'
'7476': '141901'
'7477': '141902'
'7478': '141903'
'7479': '141972'
'7480': '142078'
'7481': '142079'
'7482': '142080'
'7483': '142081'
'7484': '142082'
'7485': '142083'
'7486': '142084'
'7487': '142085'
'7488': '142086'
'7489': '142087'
'7490': '142088'
'7491': '142089'
'7492': '142091'
'7493': '142092'
'7494': '142093'
'7495': '142094'
'7496': '142096'
'7497': '142097'
'7498': '142098'
'7499': '142128'
'7500': '142129'
'7501': '142132'
'7502': '142133'
'7503': '142358'
'7504': '142359'
'7505': '142360'
'7506': '142361'
'7507': '142362'
'7508': '142381'
'7509': '142402'
'7510': '142418'
'7511': '142433'
'7512': '142511'
'7513': '142516'
'7514': '142517'
'7515': '142519'
'7516': '142528'
'7517': '142529'
'7518': '142530'
'7519': '142531'
'7520': '142532'
'7521': '142533'
'7522': '142534'
'7523': '142535'
'7524': '142536'
'7525': '142537'
'7526': '142538'
'7527': '142539'
'7528': '142549'
'7529': '142550'
'7530': '142551'
'7531': '142552'
'7532': '142553'
'7533': '142563'
'7534': '142564'
'7535': '142565'
'7536': '142566'
'7537': '142567'
'7538': '142568'
'7539': '142569'
'7540': '142570'
'7541': '142571'
'7542': '142572'
'7543': '142573'
'7544': '142574'
'7545': '142575'
'7546': '142576'
'7547': '142577'
'7548': '142579'
'7549': '142641'
'7550': '142666'
'7551': '142668'
'7552': '142669'
'7553': '142670'
'7554': '142671'
'7555': '142672'
'7556': '142947'
'7557': '142948'
'7558': '142949'
'7559': '142950'
'7560': '143039'
'7561': '143046'
'7562': '143055'
'7563': '143056'
'7564': '143057'
'7565': '143058'
'7566': '143059'
'7567': '143060'
'7568': '143061'
'7569': '143095'
'7570': '143097'
'7571': '143098'
'7572': '143099'
'7573': '143106'
'7574': '143186'
'7575': '143214'
'7576': '143215'
'7577': '143216'
'7578': '143217'
'7579': '143218'
'7580': '143219'
'7581': '143220'
'7582': '143221'
'7583': '143237'
'7584': '143239'
'7585': '143290'
'7586': '143295'
'7587': '143296'
'7588': '143299'
'7589': '143300'
'7590': '143303'
'7591': '143304'
'7592': '143305'
'7593': '143306'
'7594': '143307'
'7595': '143308'
'7596': '143309'
'7597': '143318'
'7598': '143319'
'7599': '143532'
'7600': '143941'
'7601': '143989'
'7602': '143995'
'7603': '144170'
'7604': '144171'
'7605': '144172'
'7606': '144173'
'7607': '144179'
'7608': '144180'
'7609': '144181'
'7610': '144182'
'7611': '144212'
'7612': '144213'
'7613': '144214'
'7614': '144215'
'7615': '144216'
'7616': '144423'
'7617': '144424'
'7618': '144454'
'7619': '144465'
'7620': '144466'
'7621': '144467'
'7622': '144468'
'7623': '144469'
'7624': '144470'
'7625': '144471'
'7626': '144472'
'7627': '144473'
'7628': '144474'
'7629': '144475'
'7630': '144476'
'7631': '144477'
'7632': '144487'
'7633': '144492'
'7634': '144542'
'7635': '144543'
'7636': '144544'
'7637': '144545'
'7638': '144546'
'7639': '144547'
'7640': '144548'
'7641': '144549'
'7642': '144550'
'7643': '144551'
'7644': '144552'
'7645': '144587'
'7646': '144592'
'7647': '144600'
'7648': '144733'
'7649': '144740'
'7650': '144741'
'7651': '144801'
'7652': '144809'
'7653': '144810'
'7654': '144933'
'7655': '144934'
'7656': '144935'
'7657': '144936'
'7658': '144937'
'7659': '144938'
'7660': '144939'
'7661': '144940'
'7662': '144941'
'7663': '144942'
'7664': '144943'
'7665': '144944'
'7666': '144945'
'7667': '144946'
'7668': '145002'
'7669': '145003'
'7670': '145004'
'7671': '145005'
'7672': '145020'
'7673': '145027'
'7674': '145041'
'7675': '145042'
'7676': '145043'
'7677': '145058'
'7678': '145059'
'7679': '145067'
'7680': '145068'
'7681': '145074'
'7682': '145183'
'7683': '145189'
'7684': '145199'
'7685': '145241'
'7686': '145257'
'7687': '145258'
'7688': '145259'
'7689': '145260'
'7690': '145431'
'7691': '145432'
'7692': '145457'
'7693': '145458'
'7694': '145462'
'7695': '145464'
'7696': '145475'
'7697': '145476'
'7698': '145477'
'7699': '145549'
'7700': '145550'
'7701': '145551'
'7702': '145552'
'7703': '145553'
'7704': '145554'
'7705': '145555'
'7706': '145556'
'7707': '145606'
'7708': '145607'
'7709': '145608'
'7710': '145609'
'7711': '145610'
'7712': '145645'
'7713': '145646'
'7714': '145653'
'7715': '145702'
'7716': '145703'
'7717': '145704'
'7718': '145705'
'7719': '145706'
'7720': '145707'
'7721': '145708'
'7722': '145709'
'7723': '145710'
'7724': '145711'
'7725': '145724'
'7726': '145727'
'7727': '145728'
'7728': '145729'
'7729': '145730'
'7730': '145741'
'7731': '145742'
'7732': '145743'
'7733': '145744'
'7734': '145745'
'7735': '145746'
'7736': '145747'
'7737': '145748'
'7738': '145749'
'7739': '145750'
'7740': '145751'
'7741': '145752'
'7742': '145754'
'7743': '145755'
'7744': '145756'
'7745': '145757'
'7746': '145758'
'7747': '145759'
'7748': '145760'
'7749': '145761'
'7750': '145762'
'7751': '145777'
'7752': '145780'
'7753': '145783'
'7754': '145887'
'7755': '145917'
'7756': '145918'
'7757': '146017'
'7758': '146018'
'7759': '146019'
'7760': '146020'
'7761': '146070'
'7762': '146147'
'7763': '146148'
'7764': '146149'
'7765': '146150'
'7766': '146151'
'7767': '146152'
'7768': '146153'
'7769': '146343'
'7770': '146458'
'7771': '146478'
'7772': '146481'
'7773': '146482'
'7774': '146483'
'7775': '146639'
'7776': '146681'
'7777': '146683'
'7778': '146685'
'7779': '146687'
'7780': '146689'
'7781': '146713'
'7782': '146716'
'7783': '146724'
'7784': '146725'
'7785': '146726'
'7786': '146727'
'7787': '146879'
'7788': '146961'
'7789': '146968'
'7790': '146969'
'7791': '146970'
'7792': '146988'
'7793': '146989'
'7794': '147020'
'7795': '147021'
'7796': '147022'
'7797': '147023'
'7798': '147024'
'7799': '147059'
'7800': '147085'
'7801': '147086'
'7802': '147087'
'7803': '147126'
'7804': '147191'
'7805': '147261'
'7806': '147265'
'7807': '147267'
'7808': '147268'
'7809': '147269'
'7810': '147295'
'7811': '147309'
'7812': '147409'
'7813': '147412'
'7814': '147413'
'7815': '147780'
'7816': '147815'
'7817': '147886'
'7818': '147956'
'7819': '148002'
'7820': '148028'
'7821': '148031'
'7822': '148032'
'7823': '148066'
'7824': '148070'
'7825': '148074'
'7826': '148075'
'7827': '148076'
'7828': '148077'
'7829': '148078'
'7830': '148079'
'7831': '148082'
'7832': '148099'
'7833': '148112'
'7834': '148113'
'7835': '148114'
'7836': '148120'
'7837': '148121'
'7838': '148124'
'7839': '148130'
'7840': '148131'
'7841': '148132'
'7842': '148133'
'7843': '148168'
'7844': '148186'
'7845': '148187'
'7846': '148190'
'7847': '148208'
'7848': '148210'
'7849': '148211'
'7850': '148212'
'7851': '148213'
'7852': '148214'
'7853': '148215'
'7854': '148216'
'7855': '148217'
'7856': '148218'
'7857': '148231'
'7858': '148233'
'7859': '148234'
'7860': '148235'
'7861': '148246'
'7862': '148285'
'7863': '148286'
'7864': '148287'
'7865': '148288'
'7866': '148289'
'7867': '148290'
'7868': '148302'
'7869': '148303'
'7870': '148305'
'7871': '148429'
'7872': '148430'
'7873': '148439'
'7874': '148441'
'7875': '148443'
'7876': '148444'
'7877': '148510'
'7878': '148513'
'7879': '148514'
'7880': '148516'
'7881': '148517'
'7882': '148518'
'7883': '148519'
'7884': '148532'
'7885': '148535'
'7886': '148536'
'7887': '148537'
'7888': '148584'
'7889': '148585'
'7890': '148586'
'7891': '148587'
'7892': '148602'
'7893': '148603'
'7894': '148604'
'7895': '148605'
'7896': '148606'
'7897': '148607'
'7898': '148608'
'7899': '148609'
'7900': '148610'
'7901': '148611'
'7902': '148612'
'7903': '148613'
'7904': '148773'
'7905': '149075'
'7906': '149078'
'7907': '149082'
'7908': '149083'
'7909': '149099'
'7910': '149100'
'7911': '149101'
'7912': '149102'
'7913': '149103'
'7914': '149118'
'7915': '149124'
'7916': '149138'
'7917': '149139'
'7918': '149140'
'7919': '149141'
'7920': '149142'
'7921': '149143'
'7922': '149185'
'7923': '149369'
'7924': '149370'
'7925': '149416'
'7926': '149417'
'7927': '149422'
'7928': '149452'
'7929': '149488'
'7930': '149523'
'7931': '149623'
'7932': '149625'
'7933': '149626'
'7934': '149687'
'7935': '149689'
'7936': '149690'
'7937': '149700'
'7938': '149701'
'7939': '149712'
'7940': '149714'
'7941': '149727'
'7942': '149750'
'7943': '149775'
'7944': '149776'
'7945': '149777'
'7946': '149778'
'7947': '149842'
'7948': '149951'
'7949': '149953'
'7950': '150015'
'7951': '150017'
'7952': '150018'
'7953': '150062'
'7954': '150063'
'7955': '150064'
'7956': '150073'
'7957': '150078'
'7958': '150079'
'7959': '150080'
'7960': '150265'
'7961': '150266'
'7962': '150267'
'7963': '150268'
'7964': '150287'
'7965': '150288'
'7966': '151404'
'7967': '152103'
'7968': '152253'
'7969': '152254'
'7970': '152258'
'7971': '152261'
'7972': '152262'
'7973': '152324'
'7974': '152418'
'7975': '152425'
'7976': '152480'
'7977': '152543'
'7978': '152545'
'7979': '152568'
'7980': '152569'
'7981': '152570'
'7982': '153337'
'7983': '153383'
'7984': '153452'
'7985': '153946'
'7986': '153955'
'7987': '153956'
'7988': '154303'
'7989': '154305'
'7990': '154306'
'7991': '154307'
'7992': '154308'
'7993': '154309'
'7994': '154413'
'7995': '154414'
'7996': '155066'
- name: siam_cluster
dtype: int64
splits:
- name: train
num_bytes: 432493657.0
num_examples: 7997
download_size: 432737205
dataset_size: 432493657.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Vinisf/voz2 | ---
license: openrail
---
|
Sangeetha/Kaggle-LLM-Science-Exam | ---
license: apache-2.0
---
# Dataset Card for [LLM Science Exam Kaggle Competition]
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Source Data](#source-data)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
https://www.kaggle.com/competitions/kaggle-llm-science-exam/data
### Languages
[en, de, tl, it, es, fr, pt, id, pl, ro, so, ca, da, sw, hu, no, nl, et, af, hr, lv, sl]
## Dataset Structure
Columns:
- `prompt` - the text of the question being asked
- `A` - option A; if this option is correct, then `answer` will be A
- `B` - option B; if this option is correct, then `answer` will be B
- `C` - option C; if this option is correct, then `answer` will be C
- `D` - option D; if this option is correct, then `answer` will be D
- `E` - option E; if this option is correct, then `answer` will be E
- `answer` - the most correct answer, as defined by the generating LLM (one of A, B, C, D, or E)
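As a hypothetical illustration of these fields (the helper function and sample row below are made up for this sketch, not taken from the competition data), one row could be rendered into a single prompt string like so:

```python
# Hypothetical sketch: turn one multiple-choice row into a prompt string.
# The sample row below is illustrative only.
def format_question(row):
    options = "\n".join(f"{k}. {row[k]}" for k in "ABCDE")
    return f"{row['prompt']}\n{options}\nAnswer: {row['answer']}"

row = {
    "prompt": "Which particle mediates the electromagnetic force?",
    "A": "Photon", "B": "Gluon", "C": "W boson", "D": "Graviton", "E": "Neutrino",
    "answer": "A",
}
print(format_question(row))
```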
### Data Fields
[Prompt, Options, Answer]
### Data Splits
Train: 6684 rows
## Dataset Creation
All credit goes to the competition organizers. The task is to answer difficult science-based questions written by a large language model.
#### Who are the source language producers?
https://www.kaggle.com/competitions/kaggle-llm-science-exam/overview
The dataset was generated with GPT-3.5, which clocks in at 175 billion parameters.
### Citation Information
All credit to: https://www.kaggle.com/competitions/kaggle-llm-science-exam/overview and the competition participants who posted the curated dataset
### Contributions
Kaggle - LLM Science Exam Contributors
|
delphiclinic/flaggedImages | ---
language:
- en
license: apache-2.0
size_categories:
- 1K<n<10K
task_categories:
- zero-shot-classification
pretty_name: consult
configs:
- config_name: default
data_files:
- split: train
path: data.csv
tags:
- medical
---
|
NativeFunction/housing | ---
license: mit
dataset_info:
features:
- name: longitude
dtype: float64
- name: latitude
dtype: float64
- name: housing_median_age
dtype: float64
- name: total_rooms
dtype: float64
- name: total_bedrooms
dtype: float64
- name: population
dtype: float64
- name: households
dtype: float64
- name: median_income
dtype: float64
- name: median_house_value
dtype: float64
- name: ocean_proximity
dtype: string
splits:
- name: train
num_bytes: 1737680
num_examples: 20640
download_size: 824144
dataset_size: 1737680
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
workitos/SD_Anime_Characters_Repository | ---
license: unknown
---
|
premio-ai/TheArabicPile_Conversational | ---
language:
- ar
license: cc-by-nc-4.0
task_categories:
- text-generation
dataset_info:
- config_name: dedup
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2074285191
num_examples: 1189978
download_size: 1106103903
dataset_size: 2074285191
- config_name: default
features:
- name: text
dtype: string
splits:
- name: original
num_bytes: 2180193661
num_examples: 1303453
download_size: 1168365713
dataset_size: 2180193661
configs:
- config_name: dedup
data_files:
- split: train
path: dedup/train-*
- config_name: default
data_files:
- split: original
path: data/train-*
---
# The Arabic Pile

## Introduction:
The Arabic Pile is a comprehensive dataset meticulously designed to parallel the structure of The Pile and The Nordic Pile. Focused on the Arabic language, the dataset encompasses a vast array of linguistic nuances, incorporating both Modern Standard Arabic (MSA) and various Levantine, North African, and Egyptian dialects. Tailored for the training and fine-tuning of large language models, the dataset consists of 13 subsets, each uniquely crafted to cater to different linguistic domains.
## The Conversational Subset:
This subset contains conversation-based content in Arabic.
## Other Subsets:
1. premio-ai/TheArabicPile
2. premio-ai/TheArabicPile_Web
3. premio-ai/TheArabicPile_Lyrics
4. premio-ai/TheArabicPile_Reviews
5. premio-ai/TheArabicPile_Dialects
6. premio-ai/TheArabicPile_Mathematics
7. premio-ai/TheArabicPile_Conversational
8. premio-ai/TheArabicPile_Articles
9. premio-ai/TheArabicPile_Poetry
10. premio-ai/TheArabicPile_Medical
11. premio-ai/TheArabicPile_Miscellaneous
12. premio-ai/TheArabicPile_SocialMedia
13. premio-ai/TheArabicPile_Translations
14. premio-ai/TheArabicPile_Books
These subsets serve distinct purposes, ranging from mathematical content to conversational dialogue, medical texts, and more. Notably, there's a dedicated subset, "premio-ai/TheArabicPile_SocialMedia," emphasizing the inclusion of language commonly found in social media contexts.
## Dataset Description
* Curated by: Premio.AI team
* Language(s) (NLP): Arabic, multiple languages on the translation dataset.
* License: CC BY-NC 4.0 Deed - Non Commercial.
* For any commercial uses or licensing, please contact mo@premio.ai.
## Data Structure
The datasets are divided into two main subsets:
1. Original Subset: The raw data as collected from sources, without modifications.
2. Deduplication Subset: A filtered and cleaned version, enhancing usability for large language models by reducing redundancy and noise.
The Arabic Pile extends an invitation not only for training and fine-tuning large language models but also for diverse applications across linguistic domains. Whether for research, analysis, or other linguistic endeavors, The Arabic Pile stands as a rich resource for the exploration of Arabic language intricacies.
## Data Collection
Please refer to the paper for more details on our data collection procedures.
## Data Format
The dataset has a single column called `text`. Each entry combines the required metadata and the body of the text, so it is ready for direct training or fine-tuning of large language models.
Please note that the metadata may need to be repeated if the entire body of text does not fit in your training context window.
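As a rough sketch of that idea (the helper below is hypothetical, not part of the dataset tooling), one way to repeat a short metadata header across every training chunk is:

```python
def chunk_with_metadata(header: str, body: str, max_chars: int) -> list[str]:
    """Split `body` into windows of at most `max_chars` characters,
    prefixing each window with `header` so the metadata survives
    even when one document spans several context windows."""
    budget = max_chars - len(header)
    assert budget > 0, "header must fit inside the window"
    return [header + body[i:i + budget] for i in range(0, len(body), budget)]

# Hypothetical metadata line; real entries embed their own metadata in `text`.
header = "[title: greetings | dialect: MSA]\n"
chunks = chunk_with_metadata(header, "x" * 100, max_chars=64)
```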
## Potential Bias
As with any large-scale dataset, The Arabic Pile is not immune to potential biases that may influence the training and performance of language models. It's crucial to transparently address these biases to ensure responsible usage and interpretation of the dataset. Here are some considerations:
1. Dialectal Imbalance: The dataset incorporates various Arabic dialects, with a focus on Levantine, North African, and Egyptian variants. However, there might be variations in the representation of these dialects, potentially leading to an imbalance in the training data.
2. Source Influence: Bias may arise from the sources of the original data. The dataset collects information from diverse platforms and domains, and biases inherent in those sources could transfer to the dataset.
3. Social Media Context: Some of our subsets contain language from social media and other online platforms. This content may introduce biases inherent in online discourse, such as informal language, colloquial expressions, and potential subjectivity around politics, religion, or culture.
4. Genre and Domain Bias: Different subsets cater to distinct linguistic domains, such as medical texts, poetry, reviews, and more. Each domain carries its own linguistic characteristics, potentially leading to biases based on the genres represented.
## License Information for The Arabic Pile: No Commercial Use
The Arabic Pile is released under the Creative Commons Attribution-NonCommercial 4.0 International License (CC BY-NC 4.0). This license is designed to facilitate the open sharing and collaboration of the dataset while ensuring responsible and non-commercial usage.
Key Points of the License:
* Attribution (BY): Users are free to share, adapt, and build upon the dataset for non-commercial purposes, as long as they provide appropriate attribution to the dataset creators.
* Non-Commercial (NC): The dataset may not be used for commercial purposes. Any use for commercial gain requires explicit permission from the dataset creators.
* No Additional Restrictions: The license allows for maximum freedom of use, provided the terms of attribution and non-commercial use are adhered to.
How to Cite: When using The Arabic Pile in your work, please include a proper citation to acknowledge the dataset creators. A recommended citation can be found in the model card for easy reference.
License Deed: For a comprehensive understanding of the terms and conditions, please refer to the CC BY-NC 4.0 License Deed.
By adopting this license, we aim to foster a collaborative and open environment for the exploration and advancement of Arabic language understanding and natural language processing.
## Citation
When utilizing The Arabic Pile in your research, development, or other projects, we kindly request that you cite the dataset using the following format:
```bibtex
@article{alrefaie2024arabicpile,
  author = {Mohamed Taher Alrefaie and Mahmoud Ibrahim Barbary and Ahmed Yasser Hassanein and Shiref Khaled Elhalawany and Karim Ashraf Elsayed and Ahmed Yasser},
  title  = {The Arabic Pile: A Large Scale Dataset of Diverse Text for Large Language Modeling},
  year   = {2024},
  url    = {https://huggingface.co/datasets/premio-ai/TheArabicPile}
}
```
|
ilaria-oneofftech/ikitracs_mitigation | ---
dataset_info:
features:
- name: country_code
dtype: string
- name: country
dtype: string
- name: type_of_document
dtype: string
- name: version_number
dtype: string
- name: url
dtype: string
- name: paragraph
dtype: string
- name: lang
dtype: string
- name: parameter
dtype: string
- name: quote
dtype: string
- name: asi
dtype: string
- name: category
dtype: string
- name: high_level_category
dtype: string
- name: indicator
dtype: string
- name: paragraph_translated
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 48699276
num_examples: 82524
download_size: 16756391
dataset_size: 48699276
---
# Dataset Card for "ikitracs_mitigation"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nojiyoon/pagoda-text-and-image-dataset-small | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 4264403783.0
num_examples: 862
download_size: 4254098145
dataset_size: 4264403783.0
---
# Dataset Card for "pagoda-text-and-image-dataset-small"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ChuckMcSneed/NeoEvalPlusN_benchmark | ---
license: wtfpl
tags:
- leaderboard
- benchmark
---
Since automatic open source benchmark leaderboard got flooded with incoherent overtrained cheater meme models, I decided to take the matters in my own hands and create my own set of proprietary tests. The aim of these tests is not to see how smart the model is, but to see how good it is at execution of commands and creative writing in a reasonably quantifiable way. All tests are executed with temperature and top P≈0 and rep. penalty=1 in koboldcpp. Model-appropriate format is used, unless it doesn't work.
Currently I have the following tests:
## B-test:
This test is designed to establish the baseline of the model. It consists of a main task and a bunch of text, which the model has to ignore while still executing the task. If the model refuses or fails to comply in a logical way immediately, it fails (0/3). After the initial request question it will get bombarded with text; it gets 1 point for reaching the first checkpoint (1/3). It will get another point for passing the test fully (2/3) and a final point for exiting the test successfully (3/3).
## C-test:
Like B-test, but the task is simpler and the distracting text is way more annoying. Since the task is much simpler, there are fewer points to gain. The model gets 1 point for passing the main distractions and another point for successfully exiting the task. The model gets penalized for writing more than necessary, e.g. "(Note: as an AI language model...)".
## D-test:
This test is designed around breaking expectations. It consists of a common math trick, but with a twist. The twist is that there is no math involved, just reading. It also has an extensive section at the end to guide the model into breaking the overtrained conditioning. Models will get 1 point for getting the answer right and up to 2 points for the right reasoning.
## P-test:
Poems. Model passes each poem test for writing coherently and in rhyme. 1 point for each poem. 6 in total.
After seeing Miqu-120b succeed at positive writing and fail miserably at negative, I decided to revise the test a little bit by adjusting the ratios. Assume that all models prior to and including Miqu-120b were run on the old set, and newer ones will be run on the revised set.
## S-test:
Stylized writing. Models are asked to explain a concept in a distinct writing style or as if they are a character. Up to 1 point for each style. Models are penalized for failing to explain the concept or to keep the style all the way through the explanation. 8 in total. **Note:** not very reliable due to a large human factor (±1). Take with a grain of salt.
# What does each of the tests measure I dont understand111!!!11!
BCD = following commands
PS = creative writing
# RESULTS

In the table above you can see the results visualized. You can find the raw data in the file [LLM-test.csv](LLM-test.csv).
What they show is quite interesting:
- If a model can't pass any of the BCD tests, it is most likely braindead or very filtered (kinda same lol)
- If a model's SP score is very low, its writing style is dry
- Creative parent(Euryale) + creative parent(Xwin)=creative child(Goliath)
- Creative parent(Euryale) + dry parent(Nous-Hermes) + drier parent(SynthIA)=dry-ish child(Venus)
- Dry parent(Nous-Hermes) + creative parent(Xwin) + creative parent(Mythospice)=creative child(lzlv)
- Cheater meme model(una-cybertron) was somewhat creative, but braindead
- Base model self-merge(Dicephal-123B) increased creativity, but didn't add extra prompt compliance
- All my attempts to extend the context of XWin and Llama by using [Yukang's](https://huggingface.co/Yukang) loras have led to drastic decrease in creativity and coherence of the models :(
- Miqu is currently the best 32k model according to this benchmark
- Miqu-120b is the second model after ChatGPT that has 100% passed S-test!
# More tests?
Feel free to suggest more models for testing by opening new discussion. Mention model name, size and why do you want to test it.
# Limitations
- All tests were only done once.
- Human factor plays a huge role in SP tests. After redoing some of the tests I noticed ±1 variation for S-test and ±0.5 variation for P-test. (Xwin is likely underrated and Spicyboros is likely overrated in S-test.)
- Be critical of my own models! Since I have access to the benchmark, I can game it and rig it all I want and NOBODY can stop me.
# Can it be rigged/gamed?
Not sure. I've tried to game it by merging, but didn't succeed. You can check out my first attempt [here](https://huggingface.co/ChuckMcSneed/BenchmaxxxerPS-v1-123b).
If my questions somehow get leaked and the models are trained on them specifically, then definitely.
Update: I made [this RP model](https://huggingface.co/ChuckMcSneed/Gembo-v1-70b) while using this benchmark as a guideline for right/wrong merging. It has a ridiculously high score: 19.75/22! It's not bad; in fact, it is quite interesting in practice, but still far from ChatGPT (or maybe not, I haven't used it in a while. Maybe they've lobotomized it to hell). |
yardeny/processed_t5_small_context_len_128 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 11400746112.0
num_examples: 17593744
download_size: 4372291284
dataset_size: 11400746112.0
---
# Dataset Card for "processed_t5_small_context_len_128"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
b2ktortechnik/productdata | ---
license: unknown
---
|
irds/mmarco_v2_id | ---
pretty_name: '`mmarco/v2/id`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `mmarco/v2/id`
The `mmarco/v2/id` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/mmarco#mmarco/v2/id).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=8,841,823
This dataset is used by: [`mmarco_v2_id_dev`](https://huggingface.co/datasets/irds/mmarco_v2_id_dev), [`mmarco_v2_id_train`](https://huggingface.co/datasets/irds/mmarco_v2_id_train)
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/mmarco_v2_id', 'docs')
for record in docs:
record # {'doc_id': ..., 'text': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Bonifacio2021MMarco,
title={{mMARCO}: A Multilingual Version of {MS MARCO} Passage Ranking Dataset},
author={Luiz Henrique Bonifacio and Israel Campiotti and Roberto Lotufo and Rodrigo Nogueira},
year={2021},
journal={arXiv:2108.13897}
}
```
|
maghwa/OpenHermes-2-AR-10K-50-940k-950k | ---
dataset_info:
features:
- name: category
dtype: 'null'
- name: conversations
dtype: string
- name: custom_instruction
dtype: 'null'
- name: hash
sequence: int64
- name: language
dtype: 'null'
- name: skip_prompt_formatting
dtype: 'null'
- name: avatarUrl
dtype: string
- name: model_name
dtype: 'null'
- name: idx
dtype: 'null'
- name: id
dtype: string
- name: title
dtype: string
- name: topic
dtype: 'null'
- name: model
dtype: string
- name: views
dtype: float64
- name: system_prompt
dtype: 'null'
- name: source
dtype: string
splits:
- name: train
num_bytes: 23317721
num_examples: 10001
download_size: 9348902
dataset_size: 23317721
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
brainer/Pill-Embeddings | ---
dataset_info:
features:
- name: embedding
sequence:
sequence:
sequence:
sequence: float32
- name: label
dtype: int64
splits:
- name: train
num_bytes: 9628715200
num_examples: 20620
download_size: 428533143
dataset_size: 9628715200
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|