datasetId stringlengths 2 117 | card stringlengths 19 1.01M |
|---|---|
ashwathjadhav23/Dutch_MLM_8 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 55307428
num_examples: 25000
download_size: 33148580
dataset_size: 55307428
---
# Dataset Card for "Dutch_MLM_8"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pushpdeep/UltraFeedback-paired | ---
license: mit
task_categories:
- text-generation
language:
- en
size_categories:
- 100K<n<1M
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: question
dtype: string
- name: response_j
dtype: string
- name: response_k
dtype: string
splits:
- name: train
num_bytes: 946257493
num_examples: 318777
download_size: 228559429
dataset_size: 946257493
---
# UltraFeedback Paired
This is a processed version of the [`openbmb/UltraFeedback`](https://huggingface.co/datasets/openbmb/UltraFeedback) dataset. The following steps were applied:
- Create pairs `(response_j, response_k)` where `response_j` was rated better than `response_k` based on `overall_score`
- Sample all 6 pairs for each instruction in the original data
This dataset is useful for LLM alignment techniques (like DPO). The processing steps are in [this notebook](https://huggingface.co/datasets/pushpdeep/UltraFeedback-paired/blob/main/Ultrafeedback_paired_version.ipynb). The code is based on [this repository](https://huggingface.co/datasets/lvwerra/stack-exchange-paired).
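The pairing step can be sketched as follows. This is a minimal illustration under assumed field names for the raw responses (`text`, `overall_score` are placeholders), not the notebook's exact code:

```python
from itertools import combinations

def make_pairs(question, responses):
    """Build (response_j, response_k) pairs where j was rated
    better than k based on overall_score."""
    pairs = []
    for a, b in combinations(responses, 2):
        if a["overall_score"] == b["overall_score"]:
            continue  # skip ties: no clear preference
        better, worse = (a, b) if a["overall_score"] > b["overall_score"] else (b, a)
        pairs.append({
            "question": question,
            "response_j": better["text"],  # preferred response
            "response_k": worse["text"],   # rejected response
        })
    return pairs

# With 4 rated completions per instruction (invented scores), up to
# C(4, 2) = 6 pairs are produced, matching the "6 pairs" step above.
responses = [
    {"text": "A", "overall_score": 9},
    {"text": "B", "overall_score": 7},
    {"text": "C", "overall_score": 5},
    {"text": "D", "overall_score": 2},
]
pairs = make_pairs("What is 2+2?", responses)
print(len(pairs))  # 6
```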
|
sergioq2/embedding_dt | ---
license: mit
---
|
ismailiismail/French_English_2 | ---
dataset_info:
features:
- name: french
dtype: string
- name: english
dtype: string
splits:
- name: train
num_bytes: 914954
num_examples: 2992
download_size: 352011
dataset_size: 914954
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "French_English_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CVasNLPExperiments/VQAv2_testdev_no_image_google_flan_t5_xxl_mode_T_A_D_PNP_FILTER_C_Q_rices_ns_107394 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_module_random_
num_bytes: 10135347
num_examples: 107394
- name: fewshot_0_clip_tags_ViT_L_14_with_openai_Attributes_ViT_L_14_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_module_random_
num_bytes: 10140696
num_examples: 107394
download_size: 6149030
dataset_size: 20276043
---
# Dataset Card for "VQAv2_testdev_no_image_google_flan_t5_xxl_mode_T_A_D_PNP_FILTER_C_Q_rices_ns_107394"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AiresPucrs/german-credit-data | ---
license: cc0-1.0
dataset_info:
features:
- name: Age
dtype: int64
- name: Sex
dtype: string
- name: Job
dtype: int64
- name: Housing
dtype: string
- name: Saving accounts
dtype: string
- name: Checking account
dtype: string
- name: Credit amount
dtype: int64
- name: Duration
dtype: int64
- name: Purpose
dtype: string
- name: Risk
dtype: string
splits:
- name: train
num_bytes: 85728
num_examples: 1000
download_size: 13955
dataset_size: 85728
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
language:
- en
pretty_name: german-credit-data
size_categories:
- 1K<n<10K
---
# German Credit Data
## Overview
This dataset contains information about 1,000 individuals and their credit risk (bad = 0, good = 1),
framed as a binary classification problem.
## Dataset Details
The dataset is a smaller version of the original dataset, which came from [Statlog (German Credit Data)](https://archive.ics.uci.edu/ml/datasets/statlog+(german+credit+data)).
- Dataset Name: [German Credit Risk - With Target](https://www.kaggle.com/datasets/kabure/german-credit-data-with-risk)
- Language: English
- Total Size: 1,000 examples
## Contents
The dataset consists of a data frame with the following columns:
- Age [int64]
- Sex [string]
- Job [int64]
- Housing [string]
- Saving accounts [string]
- Checking account [string]
- Credit amount [int64]
- Duration [int64]
- Purpose [string]
- Risk [string]
## How to use
```python
from datasets import load_dataset
dataset = load_dataset("AiresPucrs/german-credit-data", split="train")
```
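The binary target described in the overview (bad = 0, good = 1) can be encoded with a plain mapping. A minimal sketch, with invented rows following the Contents list above:

```python
# Encode the Risk column as the binary target described above
# (bad = 0, good = 1). The rows here are invented examples.
risk_to_label = {"bad": 0, "good": 1}

rows = [
    {"Age": 67, "Risk": "good"},
    {"Age": 22, "Risk": "bad"},
]
labels = [risk_to_label[row["Risk"]] for row in rows]
print(labels)  # [1, 0]
```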
## License
The German Credit Risk dataset is licensed under the [CC0 1.0 Universal](https://creativecommons.org/publicdomain/zero/1.0/) (Creative Commons Zero) license. |
aimona/eng-conversations_no-tokenizer_no-time | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instructions
dtype: string
splits:
- name: train
num_bytes: 384592697
num_examples: 30052
download_size: 177374362
dataset_size: 384592697
---
# Dataset Card for "eng-conversations_no-tokenizer_no-time"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sleeplesslad/yuh | ---
license: openrail
---
|
Capsekai/80sCartoons | ---
license: creativeml-openrail-m
task_categories:
- text-to-image
language:
- en
tags:
- text
- text to image
- stable diffusion
- 80s
pretty_name: Eighties Cartoons
size_categories:
- 1K<n<10K
---
# Do not resell the data. You do not own the data, but you do own the outputs of your training. See the main license for details. |
coref-data/gum_raw | ---
license: other
---
# GUM Corpus V9.2.0
- Project: https://github.com/amir-zeldes/gum
- Data source: https://github.com/amir-zeldes/gum/commit/3b0ab7d11911be1695e4dacadb28a7a1df230bdb
## Details
An English corpus annotated for coreference and other linguistic phenomena. See the project repo for full corpus license information. Annotations are licensed under CC-BY-4.0.
## Citation
```
@Article{Zeldes2017,
author = {Amir Zeldes},
title = {The {GUM} Corpus: Creating Multilayer Resources in the Classroom},
journal = {Language Resources and Evaluation},
year = {2017},
volume = {51},
number = {3},
pages = {581--612},
doi = {http://dx.doi.org/10.1007/s10579-016-9343-x}
}
@InProceedings{ZhuEtAl2021,
author = {Yilun Zhu and Sameer Pradhan and Amir Zeldes},
booktitle = {Proceedings of ACL-IJCNLP 2021},
title = {{OntoGUM}: Evaluating Contextualized {SOTA} Coreference Resolution on 12 More Genres},
year = {2021},
pages = {461--467},
address = {Bangkok, Thailand}
}
```
## Features
```python
{'coref_entities': [[{'eid': Value(dtype='string', id=None),
'eid_or_grp': Value(dtype='string', id=None),
'etype': Value(dtype='string', id=None),
'other': Value(dtype='string', id=None),
'sent_id': Value(dtype='string', id=None),
'span': Value(dtype='string', id=None)}]],
'doc_id': Value(dtype='string', id=None),
'ontogum_coref_chains': Sequence(feature=Sequence(feature=Sequence(feature=Value(dtype='int64',
id=None),
length=-1,
id=None),
length=-1,
id=None),
length=-1,
id=None),
'ontogum_sentences': [[{'deprel': Value(dtype='string', id=None),
'deps': Value(dtype='string', id=None),
'feats': Value(dtype='string', id=None),
'head': Value(dtype='int64', id=None),
'id': Value(dtype='int64', id=None),
'lemma': Value(dtype='string', id=None),
'misc': Value(dtype='string', id=None),
'text': Value(dtype='string', id=None),
'upos': Value(dtype='string', id=None),
'xpos': Value(dtype='string', id=None)}]],
'sentences': [{'comment': Value(dtype='string', id=None),
'conll_rows': [{'deprel': Value(dtype='string', id=None),
'deps': Value(dtype='string', id=None),
'feats': Value(dtype='string', id=None),
'head': Value(dtype='int64', id=None),
'id': Value(dtype='int64', id=None),
'lemma': Value(dtype='string', id=None),
'misc': Value(dtype='string', id=None),
'text': Value(dtype='string', id=None),
'upos': Value(dtype='string', id=None),
'xpos': Value(dtype='string', id=None)}],
'global_entity': Value(dtype='string', id=None),
'newdoc': Value(dtype='string', id=None),
'newpar': Value(dtype='bool', id=None),
'sent_id': Value(dtype='string', id=None),
'speaker': Value(dtype='string', id=None),
'text': Value(dtype='string', id=None),
'tokens': [{'coref_mentions': [{'eid': Value(dtype='string',
id=None),
'eid_or_grp': Value(dtype='string',
id=None),
'etype': Value(dtype='string',
id=None),
'other': {'centering': Value(dtype='string',
id=None),
'identity': Value(dtype='string',
id=None),
'infstat': Value(dtype='string',
id=None),
'link': Value(dtype='string',
id=None),
'minspan': Value(dtype='string',
id=None)},
'span': Value(dtype='string',
id=None)}],
'deprel': Value(dtype='string', id=None),
'feats': Value(dtype='string', id=None),
'form': Value(dtype='string', id=None),
'head': Value(dtype='int64', id=None),
'lemma': Value(dtype='string', id=None),
'misc': Value(dtype='string', id=None),
'ord': Value(dtype='float64', id=None),
'upos': Value(dtype='string', id=None),
'xpos': Value(dtype='string', id=None)}]}]}
``` |
arieg/cluster00_large_100 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '000212'
'1': 003708
'2': '005171'
'3': 009557
'4': 009559
'5': 009678
'6': 010384
'7': 010386
'8': 010807
'9': '013325'
'10': '014735'
'11': 014739
'12': 019187
'13': '023041'
'14': 024915
'15': '036614'
'16': 039188
'17': '040242'
'18': '040243'
'19': 040985
'20': 045128
'21': '051271'
'22': '054667'
'23': '054703'
'24': 059451
'25': '062164'
'26': '067007'
'27': '067237'
'28': '067357'
'29': '067557'
'30': 072738
'31': '073465'
'32': 073468
'33': 074391
'34': 075925
'35': 080003
'36': 085482
'37': 085484
'38': 085485
'39': 085489
'40': 087190
'41': 087363
'42': 088854
'43': 095249
'44': 095251
'45': 098622
'46': 099411
'47': '106458'
'48': '107617'
'49': '107909'
'50': '108477'
'51': '108881'
'52': '109203'
'53': '109355'
'54': '109903'
'55': '113511'
'56': '113973'
'57': '114199'
'58': '114413'
'59': '117627'
'60': '118087'
'61': '118195'
'62': '118222'
'63': '118738'
'64': '118986'
'65': '122079'
'66': '122354'
'67': '122395'
'68': '122628'
'69': '123438'
'70': '123474'
'71': '123505'
'72': '125187'
'73': '125194'
'74': '125723'
'75': '126669'
'76': '126674'
'77': '126743'
'78': '126749'
'79': '127184'
'80': '127205'
'81': '127273'
'82': '127275'
'83': '127298'
'84': '127300'
'85': '129694'
'86': '130940'
'87': '130945'
'88': '131292'
'89': '132272'
'90': '133793'
'91': '136094'
'92': '137719'
'93': '138016'
'94': '138210'
'95': '138282'
'96': '138406'
'97': '138415'
'98': '141179'
'99': '143095'
'100': '145241'
'101': '146988'
'102': '148285'
'103': '148585'
'104': '149143'
splits:
- name: train
num_bytes: 569752596.5
num_examples: 10500
download_size: 563671563
dataset_size: 569752596.5
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/general_liu_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of general_liu/劉氏歩槍/刘氏步枪 (Girls' Frontline)
This is the dataset of general_liu/劉氏歩槍/刘氏步枪 (Girls' Frontline), containing 41 images and their tags.
The core tags of this character are `black_hair, long_hair, breasts, braid, large_breasts, very_long_hair, bangs, black_eyes, hat, mole, military_hat, mole_under_eye`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 41 | 45.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/general_liu_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 41 | 28.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/general_liu_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 95 | 53.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/general_liu_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 41 | 41.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/general_liu_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 95 | 73.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/general_liu_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/general_liu_girlsfrontline',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, looking_at_viewer, solo, white_gloves, blush, boots, holding_gun, rifle, black_footwear, closed_mouth, full_body, single_braid, smile, thigh_strap, military_uniform, thighs, belt, bow, cleavage, earrings, official_alternate_costume, simple_background, sword, torn_clothes, white_background, white_headwear |
| 1 | 5 |  |  |  |  |  | 1girl, looking_at_viewer, military_uniform, solo, blush, simple_background, peaked_cap, smile, white_background, cleavage, closed_mouth, cowboy_shot, jacket, long_sleeves, open_mouth, thigh_strap, white_gloves |
| 2 | 8 |  |  |  |  |  | 1girl, smile, china_dress, fox_ears, fox_tail, looking_at_viewer, nail_polish, solo, white_dress, hair_flower, blush, multiple_tails, red_nails, sitting, thigh_strap, white_rose, bare_legs, barefoot, bead_bracelet, cleavage_cutout, closed_mouth, feet, fingernails, full_body, hair_over_one_eye, leg_ribbon, necklace, official_alternate_costume, oil-paper_umbrella, simple_background, toe_ring, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | white_gloves | blush | boots | holding_gun | rifle | black_footwear | closed_mouth | full_body | single_braid | smile | thigh_strap | military_uniform | thighs | belt | bow | cleavage | earrings | official_alternate_costume | simple_background | sword | torn_clothes | white_background | white_headwear | peaked_cap | cowboy_shot | jacket | long_sleeves | open_mouth | china_dress | fox_ears | fox_tail | nail_polish | white_dress | hair_flower | multiple_tails | red_nails | sitting | white_rose | bare_legs | barefoot | bead_bracelet | cleavage_cutout | feet | fingernails | hair_over_one_eye | leg_ribbon | necklace | oil-paper_umbrella | toe_ring |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:---------------|:--------|:--------|:--------------|:--------|:-----------------|:---------------|:------------|:---------------|:--------|:--------------|:-------------------|:---------|:-------|:------|:-----------|:-----------|:-----------------------------|:--------------------|:--------|:---------------|:-------------------|:-----------------|:-------------|:--------------|:---------|:---------------|:-------------|:--------------|:-----------|:-----------|:--------------|:--------------|:--------------|:-----------------|:------------|:----------|:-------------|:------------|:-----------|:----------------|:------------------|:-------|:--------------|:--------------------|:-------------|:-----------|:---------------------|:-----------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | | | | | X | | | X | X | X | | | | X | | | X | | | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | X | X | | X | | | | | X | X | | X | X | | | | | | | X | X | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
flax-community/conceptual-captions-12 | ---
language:
- en
---
This file contains English captions from the Conceptual 12M dataset by Google. Since we don't own the images, the TSV file provides the link to each image, the name of the downloaded file, and the caption for that image.
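A TSV laid out like this can be read with the standard library; a hedged sketch, assuming a (url, filename, caption) column order, which is not documented here:

```python
import csv

# Write a tiny invented sample in the assumed (url, filename, caption) layout.
rows = [["http://example.com/img.jpg", "img_00001.jpg", "a dog on a beach"]]
with open("cc12m_sample.tsv", "w", newline="") as f:
    csv.writer(f, delimiter="\t").writerows(rows)

# Read it back row by row.
with open("cc12m_sample.tsv", newline="") as f:
    for url, filename, caption in csv.reader(f, delimiter="\t"):
        print(caption)  # a dog on a beach
```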
We would like to thank [Luke Melas](https://github.com/lukemelas) for helping us get the cleaned CC-12M data on our TPU-VMs. |
akkasi/EnglishNLPDataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: labels
sequence: float64
- name: label2idx
dtype: string
- name: idx2label
dtype: string
splits:
- name: train
num_bytes: 16432106
num_examples: 80616
- name: validation
num_bytes: 2421791
num_examples: 10000
- name: test
num_bytes: 2456653
num_examples: 10000
download_size: 5458653
dataset_size: 21310550
---
# Dataset Card for "EnglishNLPDataset"
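A hedged sketch of using the `labels` / `idx2label` columns listed in the metadata above; it assumes `idx2label` holds a JSON-encoded index-to-name mapping and `labels` is a multi-hot float vector, which the card does not document:

```python
import json

# Invented mapping and label row, following the assumed encoding.
idx2label = json.loads('{"0": "positive", "1": "negative"}')
labels = [1.0, 0.0]  # one multi-hot row from the `labels` sequence

# Recover the names of the active labels.
active = [idx2label[str(i)] for i, v in enumerate(labels) if v == 1.0]
print(active)  # ['positive']
```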
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
johnowhitaker/Pseudagrilus | ---
license: mit
---
|
thaizenn/julesjordan | ---
language:
- en
--- |
mikegarts/oa_tell_a_joke_10000 | ---
dataset_info:
features:
- name: INSTRUCTION
dtype: string
- name: RESPONSE
dtype: string
- name: SOURCE
dtype: string
- name: METADATA
struct:
- name: link
dtype: string
- name: nsfw
dtype: bool
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 6108828
num_examples: 10000
download_size: 3247379
dataset_size: 6108828
---
# Dataset Card for "oa_tell_a_joke_10000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jlipe/playing_cards_2 | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 1033364793.36
num_examples: 1155
download_size: 1018166630
dataset_size: 1033364793.36
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
peldrak/riviera_labeled_test | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 298529487.0
num_examples: 231
download_size: 67692943
dataset_size: 298529487.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
humosleo/project-sutdion | ---
license: unlicense
---
|
LoonyZX/Roxanne | ---
license: openrail
---
|
sade-adrien/redpajama_v2_sample_1M | ---
dataset_info:
features:
- name: raw_content
dtype: string
- name: doc_id
dtype: string
- name: meta
dtype: string
- name: quality_signals
dtype: string
splits:
- name: train
num_bytes: 10406779209
num_examples: 1000000
download_size: 4624261556
dataset_size: 10406779209
---
# Dataset Card for "redpajama_v2_sample_1M"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TrieuNguyen/chest-xray | ---
license: apache-2.0
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': NORMAL
'1': PNEUMONIA
splits:
- name: train
num_bytes: 3186635036.504
num_examples: 5216
- name: validation
num_bytes: 3030633.0
num_examples: 16
- name: test
num_bytes: 79062317.0
num_examples: 624
download_size: 1230487052
dataset_size: 3268727986.504
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
joey234/mmlu-astronomy-neg | ---
dataset_info:
features:
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: question
dtype: string
splits:
- name: test
num_bytes: 46473
num_examples: 152
download_size: 28019
dataset_size: 46473
---
# Dataset Card for "mmlu-astronomy-neg"
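Per the `class_label` names in the metadata above, the `answer` column is an index over the four choices; a minimal sketch with an invented example row:

```python
# Decode the class-label `answer` column (0-3) back to a letter,
# using the names listed in the dataset_info above.
answer_names = ["A", "B", "C", "D"]
example = {
    "question": "Which planet is largest?",  # invented row
    "choices": ["Mars", "Venus", "Jupiter", "Mercury"],
    "answer": 2,
}
letter = answer_names[example["answer"]]
print(letter)  # C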
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liataynat/Yoimiya3 | ---
dataset_info:
features:
- name: text
dtype: string
- name: id
dtype: string
- name: metadata
struct:
- name: file_path
dtype: string
- name: repo_id
dtype: string
- name: token_count
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 250208146
num_examples: 23068
download_size: 75899781
dataset_size: 250208146
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_TheBloke__Chinese-Alpaca-33B-SuperHOT-8K-fp16 | ---
pretty_name: Evaluation run of TheBloke/Chinese-Alpaca-33B-SuperHOT-8K-fp16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/Chinese-Alpaca-33B-SuperHOT-8K-fp16](https://huggingface.co/TheBloke/Chinese-Alpaca-33B-SuperHOT-8K-fp16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Chinese-Alpaca-33B-SuperHOT-8K-fp16\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-31T19:21:09.032023](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Chinese-Alpaca-33B-SuperHOT-8K-fp16/blob/main/results_2023-07-31T19%3A21%3A09.032023.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24079112101610886,\n\
\ \"acc_stderr\": 0.030961801782247226,\n \"acc_norm\": 0.24208994950215265,\n\
\ \"acc_norm_stderr\": 0.03097894827141845,\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.014816195991931588,\n \"mc2\": 0.4774590793334822,\n\
\ \"mc2_stderr\": 0.01691343346185639\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2175767918088737,\n \"acc_stderr\": 0.0120572620209725,\n\
\ \"acc_norm\": 0.26791808873720135,\n \"acc_norm_stderr\": 0.012942030195136426\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.26926906990639315,\n\
\ \"acc_stderr\": 0.004426734718808876,\n \"acc_norm\": 0.29555865365465045,\n\
\ \"acc_norm_stderr\": 0.004553609405747228\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.035914440841969694,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.035914440841969694\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.31,\n\
\ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2188679245283019,\n \"acc_stderr\": 0.025447863825108608,\n\
\ \"acc_norm\": 0.2188679245283019,\n \"acc_norm_stderr\": 0.025447863825108608\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.15,\n\
\ \"acc_stderr\": 0.03588702812826372,\n \"acc_norm\": 0.15,\n \
\ \"acc_norm_stderr\": 0.03588702812826372\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.0414243971948936,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.0414243971948936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2275132275132275,\n \"acc_stderr\": 0.021591269407823778,\n \"\
acc_norm\": 0.2275132275132275,\n \"acc_norm_stderr\": 0.021591269407823778\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.037184890068181146,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.037184890068181146\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.25161290322580643,\n \"acc_stderr\": 0.02468597928623997,\n \"\
acc_norm\": 0.25161290322580643,\n \"acc_norm_stderr\": 0.02468597928623997\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.23645320197044334,\n \"acc_stderr\": 0.029896114291733552,\n \"\
acc_norm\": 0.23645320197044334,\n \"acc_norm_stderr\": 0.029896114291733552\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\"\
: 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.21212121212121213,\n \"acc_stderr\": 0.029126522834586804,\n \"\
acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.029126522834586804\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.21761658031088082,\n \"acc_stderr\": 0.02977866303775296,\n\
\ \"acc_norm\": 0.21761658031088082,\n \"acc_norm_stderr\": 0.02977866303775296\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2128205128205128,\n \"acc_stderr\": 0.020752423722128013,\n\
\ \"acc_norm\": 0.2128205128205128,\n \"acc_norm_stderr\": 0.020752423722128013\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073828,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073828\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936087,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936087\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2549019607843137,\n\
\ \"acc_stderr\": 0.03058759135160425,\n \"acc_norm\": 0.2549019607843137,\n\
\ \"acc_norm_stderr\": 0.03058759135160425\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.28270042194092826,\n \"acc_stderr\": 0.02931281415395592,\n\
\ \"acc_norm\": 0.28270042194092826,\n \"acc_norm_stderr\": 0.02931281415395592\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.28699551569506726,\n\
\ \"acc_stderr\": 0.030360379710291947,\n \"acc_norm\": 0.28699551569506726,\n\
\ \"acc_norm_stderr\": 0.030360379710291947\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.29914529914529914,\n\
\ \"acc_stderr\": 0.029996951858349497,\n \"acc_norm\": 0.29914529914529914,\n\
\ \"acc_norm_stderr\": 0.029996951858349497\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26947637292464877,\n\
\ \"acc_stderr\": 0.01586624307321506,\n \"acc_norm\": 0.26947637292464877,\n\
\ \"acc_norm_stderr\": 0.01586624307321506\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.26878612716763006,\n \"acc_stderr\": 0.023868003262500114,\n\
\ \"acc_norm\": 0.26878612716763006,\n \"acc_norm_stderr\": 0.023868003262500114\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.02355083135199509,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.02355083135199509\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.22340425531914893,\n \"acc_stderr\": 0.024847921358063962,\n \
\ \"acc_norm\": 0.22340425531914893,\n \"acc_norm_stderr\": 0.024847921358063962\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.024562204314142314,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.024562204314142314\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.20408163265306123,\n\
\ \"acc_stderr\": 0.02580128347509051,\n \"acc_norm\": 0.20408163265306123,\n\
\ \"acc_norm_stderr\": 0.02580128347509051\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.26506024096385544,\n \"acc_stderr\": 0.03436024037944967,\n\
\ \"acc_norm\": 0.26506024096385544,\n \"acc_norm_stderr\": 0.03436024037944967\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\
\ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\
\ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.23378212974296206,\n \"mc1_stderr\": 0.014816195991931588,\n\
\ \"mc2\": 0.4774590793334822,\n \"mc2_stderr\": 0.01691343346185639\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TheBloke/Chinese-Alpaca-33B-SuperHOT-8K-fp16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|arc:challenge|25_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hellaswag|10_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T19:21:09.032023.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T19:21:09.032023.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-31T19:21:09.032023.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-31T19:21:09.032023.parquet'
- config_name: results
data_files:
- split: 2023_07_31T19_21_09.032023
path:
- results_2023-07-31T19:21:09.032023.parquet
- split: latest
path:
- results_2023-07-31T19:21:09.032023.parquet
---
# Dataset Card for Evaluation run of TheBloke/Chinese-Alpaca-33B-SuperHOT-8K-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Chinese-Alpaca-33B-SuperHOT-8K-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Chinese-Alpaca-33B-SuperHOT-8K-fp16](https://huggingface.co/TheBloke/Chinese-Alpaca-33B-SuperHOT-8K-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Chinese-Alpaca-33B-SuperHOT-8K-fp16",
"harness_truthfulqa_mc_0",
    split="latest")
```
## Latest results
These are the [latest results from run 2023-07-31T19:21:09.032023](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Chinese-Alpaca-33B-SuperHOT-8K-fp16/blob/main/results_2023-07-31T19%3A21%3A09.032023.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24079112101610886,
"acc_stderr": 0.030961801782247226,
"acc_norm": 0.24208994950215265,
"acc_norm_stderr": 0.03097894827141845,
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931588,
"mc2": 0.4774590793334822,
"mc2_stderr": 0.01691343346185639
},
"harness|arc:challenge|25": {
"acc": 0.2175767918088737,
"acc_stderr": 0.0120572620209725,
"acc_norm": 0.26791808873720135,
"acc_norm_stderr": 0.012942030195136426
},
"harness|hellaswag|10": {
"acc": 0.26926906990639315,
"acc_stderr": 0.004426734718808876,
"acc_norm": 0.29555865365465045,
"acc_norm_stderr": 0.004553609405747228
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.035914440841969694,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.035914440841969694
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2188679245283019,
"acc_stderr": 0.025447863825108608,
"acc_norm": 0.2188679245283019,
"acc_norm_stderr": 0.025447863825108608
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.15,
"acc_stderr": 0.03588702812826372,
"acc_norm": 0.15,
"acc_norm_stderr": 0.03588702812826372
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.0414243971948936,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.0414243971948936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2275132275132275,
"acc_stderr": 0.021591269407823778,
"acc_norm": 0.2275132275132275,
"acc_norm_stderr": 0.021591269407823778
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.037184890068181146,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.037184890068181146
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25161290322580643,
"acc_stderr": 0.02468597928623997,
"acc_norm": 0.25161290322580643,
"acc_norm_stderr": 0.02468597928623997
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.23645320197044334,
"acc_stderr": 0.029896114291733552,
"acc_norm": 0.23645320197044334,
"acc_norm_stderr": 0.029896114291733552
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.029126522834586804,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.029126522834586804
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21761658031088082,
"acc_stderr": 0.02977866303775296,
"acc_norm": 0.21761658031088082,
"acc_norm_stderr": 0.02977866303775296
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2128205128205128,
"acc_stderr": 0.020752423722128013,
"acc_norm": 0.2128205128205128,
"acc_norm_stderr": 0.020752423722128013
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073828,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073828
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936087,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936087
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.03058759135160425,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.03058759135160425
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.28270042194092826,
"acc_stderr": 0.02931281415395592,
"acc_norm": 0.28270042194092826,
"acc_norm_stderr": 0.02931281415395592
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.28699551569506726,
"acc_stderr": 0.030360379710291947,
"acc_norm": 0.28699551569506726,
"acc_norm_stderr": 0.030360379710291947
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.29914529914529914,
"acc_stderr": 0.029996951858349497,
"acc_norm": 0.29914529914529914,
"acc_norm_stderr": 0.029996951858349497
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26947637292464877,
"acc_stderr": 0.01586624307321506,
"acc_norm": 0.26947637292464877,
"acc_norm_stderr": 0.01586624307321506
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.26878612716763006,
"acc_stderr": 0.023868003262500114,
"acc_norm": 0.26878612716763006,
"acc_norm_stderr": 0.023868003262500114
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.02355083135199509,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.02355083135199509
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.22340425531914893,
"acc_stderr": 0.024847921358063962,
"acc_norm": 0.22340425531914893,
"acc_norm_stderr": 0.024847921358063962
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.024562204314142314,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.024562204314142314
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.20408163265306123,
"acc_stderr": 0.02580128347509051,
"acc_norm": 0.20408163265306123,
"acc_norm_stderr": 0.02580128347509051
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-virology|5": {
"acc": 0.26506024096385544,
"acc_stderr": 0.03436024037944967,
"acc_norm": 0.26506024096385544,
"acc_norm_stderr": 0.03436024037944967
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931588,
"mc2": 0.4774590793334822,
"mc2_stderr": 0.01691343346185639
}
}
```
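As a sketch of how these per-task numbers roll up, assuming the results JSON above has been loaded into a plain Python dict, the macro-average accuracy over the MMLU (`hendrycksTest`) tasks can be recomputed like this (the three entries below are illustrative excerpts from the full results):

```python
# Illustrative excerpt of the results dict shown above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.22},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.2222222222222222},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.17763157894736842},
}

# Collect per-task accuracies for the MMLU subtasks and average them.
mmlu_accs = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
macro_avg = sum(mmlu_accs) / len(mmlu_accs)
print(round(macro_avg, 4))
```

Run over all 57 `hendrycksTest` entries, this reproduces the kind of aggregate reported in the `"all"` block.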
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
c01dsnap/CIC-IDS2017 | ---
license: other
license_name: other
license_link: LICENSE
---
The CICIDS2017 dataset consists of labeled network flows, including full packet payloads in pcap format. The corresponding profiles, the labeled flows (GeneratedLabelledFlows.zip), and CSV files for machine and deep learning purposes (MachineLearningCSV.zip) are publicly available for researchers. If you are using our dataset, you should cite our related paper, which outlines the details of the dataset and its underlying principles:
* Iman Sharafaldin, Arash Habibi Lashkari, and Ali A. Ghorbani, “Toward Generating a New Intrusion Detection Dataset and Intrusion Traffic Characterization”, 4th International Conference on Information Systems Security and Privacy (ICISSP), Portugal, January 2018 |
jordandavis/color_squares | ---
dataset_info:
features:
- name: color
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 12263.0
num_examples: 41
download_size: 5215
dataset_size: 12263.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
therem/dpo_dataset | ---
dataset_info:
- config_name: default
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 4278828
num_examples: 2889
- name: test
num_bytes: 1074941
num_examples: 723
download_size: 1477190
dataset_size: 5353769
- config_name: main
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 4278828
num_examples: 2889
- name: test
num_bytes: 1074941
num_examples: 723
download_size: 1477190
dataset_size: 5353769
- config_name: prompt_eval
features:
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 6342
num_examples: 49
download_size: 8032
dataset_size: 6342
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- config_name: main
data_files:
- split: train
path: main/train-*
- split: test
path: main/test-*
- config_name: prompt_eval
data_files:
- split: train
path: prompt_eval/train-*
---
|
CyberHarem/lunaru_granbluefantasy | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of lunaru/ルナール (Granblue Fantasy)
This is the dataset of lunaru/ルナール (Granblue Fantasy), containing 56 images and their tags.
The core tags of this character are `long_hair, pointy_ears, eyepatch, black_hair, hat, bangs, blunt_bangs, medical_eyepatch, blue_eyes, purple_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 56 | 47.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lunaru_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 56 | 32.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lunaru_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 120 | 68.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lunaru_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 56 | 44.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lunaru_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 120 | 87.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lunaru_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/lunaru_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
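Building on the loop above, a simple follow-up is to tally how often each tag occurs across the dataset. The sketch below is purely illustrative: it assumes each item's tags can be flattened into a list of strings (the `items_tags` sample here is hypothetical; in practice you would collect the tags while iterating over the `LocalSource` items):

```python
from collections import Counter

# Hypothetical per-image tag lists; replace with tags gathered
# from the LocalSource iteration shown above.
items_tags = [
    ["1girl", "solo", "long_hair"],
    ["1girl", "smile"],
]

# Count how many images carry each tag.
counts = Counter(tag for tags in items_tags for tag in tags)
print(counts.most_common(2))
```

Frequency tables like this are how the tag clusters listed below can be inspected or re-derived locally.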
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | 1girl, long_sleeves, solo, harvin, looking_at_viewer, blush, black_dress, black_headwear, frills, holding, gothic_lolita, white_shirt, closed_mouth, purple_hair, belt, brooch, striped_bow, white_background, wide_sleeves, black_footwear, puffy_sleeves, shoes, simple_background, smile, very_long_hair |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | long_sleeves | solo | harvin | looking_at_viewer | blush | black_dress | black_headwear | frills | holding | gothic_lolita | white_shirt | closed_mouth | purple_hair | belt | brooch | striped_bow | white_background | wide_sleeves | black_footwear | puffy_sleeves | shoes | simple_background | smile | very_long_hair |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-------|:---------|:--------------------|:--------|:--------------|:-----------------|:---------|:----------|:----------------|:--------------|:---------------|:--------------|:-------|:---------|:--------------|:-------------------|:---------------|:-----------------|:----------------|:--------|:--------------------|:--------|:-----------------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
heliosprime/twitter_dataset_1712977471 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 11948
num_examples: 26
download_size: 10140
dataset_size: 11948
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1712977471"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Bikesuffer/cali_lp20 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 735059.0
num_examples: 20
download_size: 726566
dataset_size: 735059.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
renzr/iceskaters | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 350984415.0
num_examples: 43
download_size: 19889956
dataset_size: 350984415.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
khoomeik/gzipscale-0.61-100M | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 440385990
num_examples: 390625
download_size: 278142814
dataset_size: 440385990
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sharren/originalSkin | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': akiec
'1': bcc
'2': bkl
'3': df
'4': mel
'5': nv
'6': vasc
splits:
- name: train
num_bytes: 1395929276.984
num_examples: 5128
- name: validation
num_bytes: 802343124.612
num_examples: 2884
- name: test
num_bytes: 554017223.564
num_examples: 2003
download_size: 2770507239
dataset_size: 2752289625.16
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
KaiLv/UDR_Yelp | ---
dataset_info:
features:
- name: idx
dtype: int64
- name: label
dtype: int64
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 22696875
num_examples: 30000
- name: test
num_bytes: 2261177
num_examples: 3000
- name: debug
num_bytes: 3745338
num_examples: 5000
download_size: 18407788
dataset_size: 28703390
---
# Dataset Card for "UDR_Yelp"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FreedomIntelligence/MMLU_Japanese | ---
license: mit
language:
- ja
---
Japanese version of the MMLU dataset, translated by gpt-3.5-turbo.
The dataset is used in the research related to [MultilingualSIFT](https://github.com/FreedomIntelligence/MultilingualSIFT). |
CyberHarem/hagikaze_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of hagikaze/萩風 (Kantai Collection)
This is the dataset of hagikaze/萩風 (Kantai Collection), containing 500 images and their tags.
The core tags of this character are `long_hair, purple_hair, ahoge, one_side_up, brown_eyes, breasts, ribbon, red_ribbon, neck_ribbon, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 464.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hagikaze_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 312.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hagikaze_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1166 | 657.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hagikaze_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 427.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hagikaze_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1166 | 846.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hagikaze_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hagikaze_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 18 |  |  |  |  |  | 1girl, black_skirt, black_vest, pleated_skirt, short_sleeves, solo, white_gloves, white_shirt, simple_background, white_background, blouse, looking_at_viewer, school_uniform, twitter_username, smile, cowboy_shot, one-hour_drawing_challenge, blush, open_mouth |
| 1 | 5 |  |  |  |  |  | 1girl, black_skirt, black_vest, blouse, full_body, grey_socks, kneehighs, pleated_skirt, short_sleeves, simple_background, solo, white_background, white_gloves, white_shirt, school_uniform, black_footwear, chibi, loafers, looking_at_viewer, smile, standing |
| 2 | 6 |  |  |  |  |  | 1girl, black_skirt, black_vest, blouse, pleated_skirt, purple_panties, short_sleeves, solo, white_shirt, white_gloves, lifted_by_self, skirt_lift, dress_shirt, sitting |
| 3 | 9 |  |  |  |  |  | 1girl, black_vest, blouse, gradient_background, short_sleeves, solo, upper_body, white_gloves, white_shirt, looking_at_viewer, open_mouth, dated, one-hour_drawing_challenge, red_background, blush, school_uniform |
| 4 | 12 |  |  |  |  |  | 1girl, navel, simple_background, solo, underwear_only, blush, cleavage, collarbone, looking_at_viewer, white_background, purple_panties, twitter_username, purple_bra, cowboy_shot, large_breasts, open_mouth, one-hour_drawing_challenge, smile, bare_shoulders, groin, white_gloves, yellow_eyes |
| 5 | 5 |  |  |  |  |  | 1girl, blush, collarbone, large_breasts, looking_at_viewer, solo, cleavage, smile, dated, purple_bikini, purple_eyes, alternate_costume, cowboy_shot, navel, one-hour_drawing_challenge, simple_background, twitter_username, white_background |
| 6 | 11 |  |  |  |  |  | 1girl, black_dress, simple_background, solo, enmaided, looking_at_viewer, white_apron, blush, maid_apron, frilled_apron, white_background, maid_headdress, open_mouth, short_sleeves, smile, white_gloves, cleavage, hair_between_eyes, one-hour_drawing_challenge, thighhighs, twitter_username |
| 7 | 9 |  |  |  |  |  | detached_collar, playboy_bunny, rabbit_ears, 1girl, fake_animal_ears, solo, looking_at_viewer, strapless_leotard, wrist_cuffs, blush, cleavage, open_mouth, smile, alternate_costume, black_leotard, bowtie, fishnet_pantyhose, large_breasts, simple_background, white_background, bare_shoulders, covered_navel, rabbit_tail |
| 8 | 7 |  |  |  |  |  | 1girl, alternate_costume, obi, solo, yukata, looking_at_viewer, smile, alternate_hairstyle, blue_kimono, ponytail, simple_background, white_background, hairclip, open_mouth, bag, blush, dated, full_body, long_sleeves, sandals, twitter_username, wide_sleeves, yellow_eyes |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_skirt | black_vest | pleated_skirt | short_sleeves | solo | white_gloves | white_shirt | simple_background | white_background | blouse | looking_at_viewer | school_uniform | twitter_username | smile | cowboy_shot | one-hour_drawing_challenge | blush | open_mouth | full_body | grey_socks | kneehighs | black_footwear | chibi | loafers | standing | purple_panties | lifted_by_self | skirt_lift | dress_shirt | sitting | gradient_background | upper_body | dated | red_background | navel | underwear_only | cleavage | collarbone | purple_bra | large_breasts | bare_shoulders | groin | yellow_eyes | purple_bikini | purple_eyes | alternate_costume | black_dress | enmaided | white_apron | maid_apron | frilled_apron | maid_headdress | hair_between_eyes | thighhighs | detached_collar | playboy_bunny | rabbit_ears | fake_animal_ears | strapless_leotard | wrist_cuffs | black_leotard | bowtie | fishnet_pantyhose | covered_navel | rabbit_tail | obi | yukata | alternate_hairstyle | blue_kimono | ponytail | hairclip | bag | long_sleeves | sandals | wide_sleeves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:-------------|:----------------|:----------------|:-------|:---------------|:--------------|:--------------------|:-------------------|:---------|:--------------------|:-----------------|:-------------------|:--------|:--------------|:-----------------------------|:--------|:-------------|:------------|:-------------|:------------|:-----------------|:--------|:----------|:-----------|:-----------------|:-----------------|:-------------|:--------------|:----------|:----------------------|:-------------|:--------|:-----------------|:--------|:-----------------|:-----------|:-------------|:-------------|:----------------|:-----------------|:--------|:--------------|:----------------|:--------------|:--------------------|:--------------|:-----------|:--------------|:-------------|:----------------|:-----------------|:--------------------|:-------------|:------------------|:----------------|:--------------|:-------------------|:--------------------|:--------------|:----------------|:---------|:--------------------|:----------------|:--------------|:------|:---------|:----------------------|:--------------|:-----------|:-----------|:------|:---------------|:----------|:---------------|
| 0 | 18 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | X | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | X | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 9 |  |  |  |  |  | X | | X | | X | X | X | X | | | X | X | X | | | | X | X | X | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 12 |  |  |  |  |  | X | | | | | X | X | | X | X | | X | | X | X | X | X | X | X | | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | | | | X | | | X | X | | X | | X | X | X | X | X | | | | | | | | | | | | | | | | X | | X | | X | X | | X | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 11 |  |  |  |  |  | X | | | | X | X | X | | X | X | | X | | X | X | | X | X | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 7 | 9 |  |  |  |  |  | X | | | | | X | | | X | X | | X | | | X | | | X | X | | | | | | | | | | | | | | | | | | | X | | | X | X | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 8 | 7 |  |  |  |  |  | X | | | | | X | | | X | X | | X | | X | X | | | X | X | X | | | | | | | | | | | | | | X | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X |
|
PerceptionEval/Correspondence | ---
dataset_info:
features:
- name: idx
dtype: int32
- name: question
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: choices
sequence: string
- name: answer
dtype: string
- name: prompt
dtype: string
splits:
- name: val
num_bytes: 148356388.0
num_examples: 172
- name: test
num_bytes: 142006440.0
num_examples: 172
download_size: 288870472
dataset_size: 290362828.0
configs:
- config_name: default
data_files:
- split: val
path: data/val-*
- split: test
path: data/test-*
---
|
frenchpierre/firstdelosdataset | ---
license: apache-2.0
--- |
RyokoExtra/SuperWIKI-Cleaned | ---
license: cc-by-sa-3.0
language:
- en
task_categories:
- text-generation
- fill-mask
tags:
- language-modeling
- masked-language-modeling
pretty_name: SuperWIKI Cleaned
configs:
- config_name: default
default: true
data_files:
- split: lang50NightShade
path:
- "*-lang50NightShade-*.json.gz"
- split: lang50
path:
- "*-lang50-*.json.gz"
- split: lang25
path:
- "*-lang25-*.json.gz"
---
# Dataset Card for SuperWIKI Cleaned
## Dataset Description
- **Homepage:** (TODO)
- **Repository:** N/A
- **Paper:** N/A
- **Leaderboard:** N/A
- **Point of Contact:** KaraKaraWitch
### Dataset Summary
> If you show most of those to people and ask them to form an opinion,
> the answer isn't just going to be "I don't know": it'll be "I don't care."
> - [Tom Scott](https://www.youtube.com/watch?v=ALy6e7GbDRQ&t=90s)
>
SuperWIKI Cleaned is a focused dataset of Wikipedia articles.
This dataset is derived from raw files provided in [SuperWIKI](https://huggingface.co/datasets/RyokoExtra/SuperWIKI).
### Supported Tasks and Leaderboards
The dataset is generally used for Language Modeling.
### Languages
- English
## Dataset Structure
All the files are located in gzip'd jsonl files.
### Data Instances
Refer to this sample to see all the fields:
```json
{
"id": 35507,
"text": "In computer network communications, the **HTTP 404**, **404 not found**, **404**, **404 error**, **page not found** or **file not found** error message is a hypertext transfer protocol (HTTP) standard response code, to indicate that the browser was able to communicate with a given server, but the server could not find what was requested. The error may also be used when a server does not wish to disclose whether it has the requested information.<TRUNCATED>",
"title": "HTTP 404",
"url": "https://en.wikipedia.org/wiki/HTTP_404",
"filters": {
"issues": [],
"selectors": [],
"templates": [
"template:http",
"template:redirect",
"template:use dmy dates",
"template:cite book",
"template:portal",
"template:anchor",
"template:pp-move-indef",
"template:cite news",
"template:reflist",
"template:short description",
"template:citation",
"template:error messages",
"template:pp-semi-indef",
"template:cite journal",
"template:cite web"
],
"rituals": []
},
"infobox_html": [],
"figures_dict": [
{
"file_url": "./File:Wikipedia_404_Page.png",
"caption": "English Wikipedia's 404 Page"
},
{
"file_url": "./File:Wikimedia_error_404.png",
"caption": "The Wikimedia 404 message"
}
]
}
```
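Since the splits are stored as gzip'd JSON Lines, a record can be read back with only the standard library. A minimal sketch (round-tripping a toy record with the same top-level fields as the sample above; the field values here are illustrative, not real dataset contents):

```python
import gzip
import io
import json

# Toy record mirroring the top-level fields shown in the sample above.
record = {"id": 35507, "title": "HTTP 404",
          "url": "https://en.wikipedia.org/wiki/HTTP_404", "text": "..."}

# Write one gzip'd JSONL line to an in-memory buffer...
buf = io.BytesIO()
with gzip.open(buf, "wt", encoding="utf-8") as f:
    f.write(json.dumps(record) + "\n")

# ...and read it back, one JSON object per line.
buf.seek(0)
with gzip.open(buf, "rt", encoding="utf-8") as f:
    rows = [json.loads(line) for line in f]

print(rows[0]["title"])  # HTTP 404
```

The same read loop applies to the dataset's actual `*.json.gz` files, substituting a file path for the in-memory buffer.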
### Data Fields
`id`: The article ID in question
`text`: The HTML Text (After post-processing) from SuperWIKI converted to markdown with links removed and formatting (Bold, italics) kept.
`title`: The title of the wikipedia article.
`url`: The URL of the article.
`filters`: Metadata of filters found/used in the dataset.
- `issues`: A custom list of templates that have been removed from the HTML (i.e. during pre-processing) for the article.
- `selectors`: `issues` are based on templates, and multiple templates may mean the same thing. In that case, the selectors provide a deduplicated set of CSS class selectors that were used for the article. (`Template:Few sources` is the same as `Template:More citations needed`, for example.)
- `rituals`: List of "Rituals" used to remove even more "Issue" templates. If not present, this field is empty.
- `templates`: Used for debugging; lists all the templates found in the article.
`infobox_html`: A list of side infoboxes that are extracted out from the text.
`figures_dict`: A list of figures used in the article. Again, extracted out from the text.
#### Q-Score Distribution
Not Applicable
### Data Splits
No data splits were done.
## Dataset Creation
### Curation Rationale
"Wikipedia is a wonderful resource; however, it could be considered too sparse, as there are many articles that are not important for the common user..."
> The abundance of less significant or obscure topics can also contribute to the perceived sparsity. While Wikipedia's commitment to covering even niche subjects is commendable, it might be overwhelming for casual users seeking concise and essential information. For instance, niche historical events, minor fictional characters, or obscure scientific theories might exist as standalone articles, but their relevance to the everyday reader could be questioned. - *ChatGPT*
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
This dataset removes all "Notice" templates from all articles to provide a cleaner version of Wikipedia.
You should consider adding the flags back into the dataset if you want to inform users about potential issues.
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
KaraKaraWitch
### Licensing Information
Most of Wikipedia's text and many of its images are co-licensed under the
[Creative Commons Attribution-ShareAlike 3.0 Unported License](https://en.wikipedia.org/wiki/Wikipedia:Text_of_Creative_Commons_Attribution-ShareAlike_3.0_Unported_License)
(CC BY-SA) and the [GNU Free Documentation License](https://en.wikipedia.org/wiki/Wikipedia:Text_of_the_GNU_Free_Documentation_License)
(GFDL) (unversioned, with no invariant sections, front-cover texts, or back-cover texts).
Some text has been imported only under CC BY-SA and CC BY-SA-compatible license and cannot be reused under GFDL; such
text will be identified on the page footer, in the page history, or on the discussion page of the article that utilizes
the text.
### Citation Information
```
@misc{superwiki,
title = {SuperWIKI Cleaned: Wikipedia for commoners.},
author = {KaraKaraWitch},
year = {2023},
howpublished = {\url{https://huggingface.co/datasets/RyokoExtra/SuperWIKI}},
}
```
### Name Etymology
N/A
### Contributions
- [@KaraKaraWitch (Twitter)](https://twitter.com/KaraKaraWitch) for gathering this dataset.
- [@sirneggles (Twitter)](https://twitter.com/sirneggles) for provided compute. |
erhwenkuo/openorca-chinese-zhtw | ---
dataset_info:
features:
- name: id
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 6491661288
num_examples: 4233915
download_size: 4106469779
dataset_size: 6491661288
language:
- zh
license: mit
task_categories:
- conversational
- text-classification
- token-classification
- table-question-answering
- question-answering
- zero-shot-classification
- summarization
- feature-extraction
- text-generation
- text2text-generation
pretty_name: ' openorca-chinese-zhtw'
size_categories:
- 10M<n<100M
---
## Table of Contents
- [Dataset Summary](#dataset-summary)
- [Dataset Attribution](#dataset-attribution)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Dataset Use](#dataset-use)
- [Use Cases](#use-cases)
- [Usage Caveats](#usage-caveats)
- [Getting Started](#getting-started)
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "openorca-chinese-zhtw"
<a name="dataset-summary"></a>
# Dataset Summary
The OpenOrca dataset is a collection of augmented [FLAN Collection data](https://arxiv.org/abs/2301.13688).
Currently ~1M GPT-4 completions, and ~3.2M GPT-3.5 completions.
It is tabularized in alignment with the distributions presented in the ORCA paper and currently represents a partial completion of the full intended dataset, with ongoing generation to expand its scope.
The data is primarily used for training and evaluation in the field of natural language processing.
<a name="supported-tasks-and-leaderboards"></a>
# Supported Tasks and Leaderboards
This dataset supports a range of tasks including language modeling, text generation, and text augmentation.
It has been instrumental in the generation of multiple high-performing model checkpoints which have exhibited exceptional performance in our unit testing.
Further information on leaderboards will be updated as they become available.
<a name="languages"></a>
# Languages
The language of the origin data is primarily English, and this dataset was translated to Traditional Chinese using Google Translate.
<a name="dataset-structure"></a>
# Dataset Structure
<a name="data-instances"></a>
## Data Instances
A data instance in this dataset represents entries from the FLAN collection which have been augmented by submitting the listed question to either GPT-4 or GPT-3.5.
The response is then entered into the response field.
<a name="data-fields"></a>
## Data Fields
The fields are:
1) 'id', a unique numbered identifier which includes one of 'niv', 't0', 'cot', or 'flan' to represent which source FLAN Collection submix the 'question' is sourced from.
2) 'system_prompt', representing the System Prompt presented to the GPT-3.5 or GPT-4 API for the datapoint
3) 'question', representing a question entry as provided by the FLAN Collection
4) 'response', a response to that question received from a query to either GPT-3.5 or GPT-4.
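As an illustration of the `id` field, a minimal sketch of recovering the source submix (a hypothetical helper, not part of the dataset; it assumes ids take the form described above, e.g. a submix name followed by a dot and a number such as `cot.93826`):

```python
# FLAN Collection submixes referenced in the 'id' field description above.
SUBMIXES = {"niv", "t0", "cot", "flan"}

def submix_of(example_id: str) -> str:
    # Take the part before the first dot and check it against known submixes.
    prefix = example_id.split(".", 1)[0]
    return prefix if prefix in SUBMIXES else "unknown"

print(submix_of("cot.93826"))  # cot
```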
<a name="data-splits"></a>
## Data Splits
The data is unsplit.
<a name="dataset-creation"></a>
# Dataset Creation
<a name="curation-rationale"></a>
## Curation Rationale
The dataset was created to provide a source of augmented text data for researchers and developers.
The datapoints are intended primarily to provide an enhancement of the core FLAN Collection data which relies upon the detailed step by step reasoning capabilities of GPT-3.5 and GPT-4.
This "reasoning trace" augmentation has demonstrated exceptional results, allowing a LLaMA-13B model trained with this data to rival or beat GPT-3.5 on broad sets of hard reasoning tasks which all models below 100B parameters had previously performed dramatically worse on.
<a name="source-data"></a>
## Source Data
The data is generated using techniques in alignment with the distributions outlined in the Orca paper, except as noted below:
1) There is not enough CoT data in the FLAN Collection to generate 150K zero-shot entries, as the paper purports to use.
We suspect this portion was either undocumented or misrepresented. We have used the ~75K points available.
2) We used the pre-generated FLAN Collection datasets hosted on HuggingFace under conceptofmind, e.g. [conceptofmind/flan2021](https://huggingface.co/datasets/conceptofmind/flan2021_submix_original).
These are referenced by the [official FLAN Collection repo](https://github.com/google-research/FLAN/tree/main/flan/v2) as the preferred data source.
However, these are a subset of the full FLAN Collection data, and have less than the required entries for the flan2021 and t0 submixes, by ~1.25M and 200k respectively.
Combined, this gave us ~1.5M fewer datapoints than in the original Orca paper. Completing the set is an ongoing work.
<a name="dataset-use"></a>
# Dataset Use
<a name="use-cases"></a>
## Use Cases
The dataset can be used for tasks related to language understanding, natural language processing, machine learning model training, and model performance evaluation.
<a name="usage-caveats"></a>
## Usage Caveats
Given that this is a work-in-progress dataset, it is recommended to regularly check for updates and improvements.
Further, the data should be used in accordance with the guidelines and recommendations outlined in the Orca paper.
<a name="getting-started"></a>
## Getting Started
This dataset is organized such that it can be naively loaded via the Hugging Face `datasets` library.
We recommend using streaming due to the large size of the files.
Regular updates and data generation progress can be monitored through the OpenOrca repository on Hugging Face.
# Citation
```bibtex
@misc{OpenOrca,
title = {OpenOrca: An Open Dataset of GPT Augmented FLAN Reasoning Traces},
author = {Wing Lian and Bleys Goodson and Eugene Pentland and Austin Cook and Chanvichet Vong and "Teknium"},
year = {2023},
publisher = {HuggingFace},
journal = {HuggingFace repository},
  howpublished = {\url{https://huggingface.co/Open-Orca/OpenOrca}},
}
```
```bibtex
@misc{mukherjee2023orca,
title={Orca: Progressive Learning from Complex Explanation Traces of GPT-4},
author={Subhabrata Mukherjee and Arindam Mitra and Ganesh Jawahar and Sahaj Agarwal and Hamid Palangi and Ahmed Awadallah},
year={2023},
eprint={2306.02707},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
```bibtex
@misc{longpre2023flan,
title={The Flan Collection: Designing Data and Methods for Effective Instruction Tuning},
author={Shayne Longpre and Le Hou and Tu Vu and Albert Webson and Hyung Won Chung and Yi Tay and Denny Zhou and Quoc V. Le and Barret Zoph and Jason Wei and Adam Roberts},
year={2023},
eprint={2301.13688},
archivePrefix={arXiv},
primaryClass={cs.AI}
}
```
```bibtex
@misc{touvron2023llama,
title={Llama 2: Open Foundation and Fine-Tuned Chat Models},
author={Hugo Touvron and Louis Martin and Kevin Stone and Peter Albert and Amjad Almahairi and Yasmine Babaei and Nikolay Bashlykov and Soumya Batra and Prajjwal Bhargava and Shruti Bhosale and Dan Bikel and Lukas Blecher and Cristian Canton Ferrer and Moya Chen and Guillem Cucurull and David Esiobu and Jude Fernandes and Jeremy Fu and Wenyin Fu and Brian Fuller and Cynthia Gao and Vedanuj Goswami and Naman Goyal and Anthony Hartshorn and Saghar Hosseini and Rui Hou and Hakan Inan and Marcin Kardas and Viktor Kerkez and Madian Khabsa and Isabel Kloumann and Artem Korenev and Punit Singh Koura and Marie-Anne Lachaux and Thibaut Lavril and Jenya Lee and Diana Liskovich and Yinghai Lu and Yuning Mao and Xavier Martinet and Todor Mihaylov and Pushkar Mishra and Igor Molybog and Yixin Nie and Andrew Poulton and Jeremy Reizenstein and Rashi Rungta and Kalyan Saladi and Alan Schelten and Ruan Silva and Eric Michael Smith and Ranjan Subramanian and Xiaoqing Ellen Tan and Binh Tang and Ross Taylor and Adina Williams and Jian Xiang Kuan and Puxin Xu and Zheng Yan and Iliyan Zarov and Yuchen Zhang and Angela Fan and Melanie Kambadur and Sharan Narang and Aurelien Rodriguez and Robert Stojnic and Sergey Edunov and Thomas Scialom},
year={2023},
      eprint={2307.09288},
      archivePrefix={arXiv}
}
@software{touvron2023llama,
title={LLaMA: Open and Efficient Foundation Language Models},
author={Touvron, Hugo and Lavril, Thibaut and Izacard, Gautier and Martinet, Xavier and Lachaux, Marie-Anne and Lacroix, Timoth{\'e}e and Rozi{\`e}re, Baptiste and Goyal, Naman and Hambro, Eric and Azhar, Faisal and Rodriguez, Aurelien and Joulin, Armand and Grave, Edouard and Lample, Guillaume},
journal={arXiv preprint arXiv:2302.13971},
year={2023}
}
``` |
pgajo/EW-TT-PE_en-it_spaced | ---
dataset_info:
features:
- name: data
struct:
- name: text_en
dtype: string
- name: text_it
dtype: string
- name: predictions
list:
- name: model_version
dtype: string
- name: result
list:
- name: from_name
dtype: string
- name: to_name
dtype: string
- name: type
dtype: string
- name: value
struct:
- name: end
dtype: int64
- name: labels
sequence: string
- name: start
dtype: int64
splits:
- name: train
num_bytes: 1833192.0
num_examples: 630
- name: test
num_bytes: 203688.0
num_examples: 70
download_size: 216527
dataset_size: 2036880.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Major-TOM/Core-S1RTC | ---
license: cc-by-sa-4.0
tags:
- earth-observation
- remote-sensing
- sentinel-1
- sar
- synthethic-aperture-radar
- satellite
size_categories:
- 1M<n<10M
dataset_info:
- config_name: default
features:
- name: product_id
dtype: string
- name: grid_cell
dtype: string
- name: product_datetime
dtype: string
- name: thumbnail
dtype: image
- name: vv
dtype: binary
- name: vh
dtype: binary
configs:
- config_name: default
data_files: images/*.parquet
- config_name: metadata
data_files: metadata.parquet
---
# Core-S1RTC
Contains a global coverage of Sentinel-1 (RTC) patches, each of size 1,068 x 1,068 pixels.
| Source | Sensing Type | Number of Patches | Patch Size | Total Pixels |
|--------|--------------|-------------------|------------|--------------|
|Sentinel-1 RTC | Synthetic Aperture Radar |1,469,955|1,068 x 1,068 (10 m) | > 1.676 Trillion |
## Content
| Column | Details | Resolution |
|--------|---------|------------|
| VV | Received Linear Power in the VV Polarization | 10m |
| VH | Received Linear Power in the VH Polarization | 10m |
| thumbnail | Rescaled false colour<sup>1</sup> saved as png | 10m |
<sup>1</sup> False colour composites are made with decibel-scale values with red, green, and blue defined as ```R:VV G:VV+VH B:VH```. For each channel, a contrast-stretch is applied, transforming minimum-maximum to 0-255. This means bluer areas have relatively higher VH values, whilst brightness is a function of overall intensity. This is relative within each thumbnail because of the normalisation, and so cannot be compared across different samples.
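The per-channel contrast stretch described in the footnote can be sketched in pure Python (an illustrative helper, not the actual Major-TOM pipeline code):

```python
# Map a channel's minimum..maximum range onto 0..255, as described for the
# thumbnail false-colour composites above.
def stretch(channel):
    lo, hi = min(channel), max(channel)
    if hi == lo:
        return [0 for _ in channel]  # flat channel: nothing to stretch
    return [round(255 * (v - lo) / (hi - lo)) for v in channel]

# Illustrative values for one channel (real inputs are decibel-scale):
print(stretch([0.0, 0.5, 1.0]))  # [0, 128, 255]
```

Applied independently to the R, G, and B channels, this normalisation is why brightness is only comparable within a single thumbnail.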
## Spatial Coverage
This is a global monotemporal dataset. Nearly every piece of Earth captured by Sentinel-1 is contained at least once in this dataset (and only once, excluding some marginal overlaps). The coverage is about 35% lower than for the Core Sentinel-2 dataset due to the sensor coverage limitations.
The following figure demonstrates the spatial coverage (only black pixels are absent):

## Example Use
Interface scripts are available at https://github.com/ESA-PhiLab/Major-TOM
Here's a sneak peek with a thumbnail image:
```python
from fsspec.parquet import open_parquet_file
import pyarrow.parquet as pq
from io import BytesIO
from PIL import Image
PARQUET_FILE = 'part_03900' # parquet number
ROW_INDEX = 42 # row number (about 500 per parquet)
url = "https://huggingface.co/datasets/Major-TOM/Core-S1RTC/resolve/main/images/{}.parquet".format(PARQUET_FILE)
with open_parquet_file(url,columns = ["thumbnail"]) as f:
with pq.ParquetFile(f) as pf:
first_row_group = pf.read_row_group(ROW_INDEX, columns=['thumbnail'])
stream = BytesIO(first_row_group['thumbnail'][0].as_py())
image = Image.open(stream)
```
## Cite
[](https://arxiv.org/abs/2402.12095/)
```latex
@inproceedings{Major_TOM,
title={Major TOM: Expandable Datasets for Earth Observation},
author={Alistair Francis and Mikolaj Czerkawski},
year={2024},
eprint={2402.12095},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
Powered by [Φ-lab, European Space Agency (ESA) 🛰️](https://huggingface.co/ESA-philab) |
BNNT/mozi_general_instructions_3m | ---
license: apache-2.0
---
Sources are listed below:
- Chinese general instructions (2,000k): BELLE, https://huggingface.co/datasets/BelleGroup/train_2M_CN
- English generic instructions (52k): alpaca-gpt4, https://github.com/Instruction-Tuning-with-GPT-4/GPT-4-LLM
- Chinese generic dialogue instructions (800k): BELLE, https://huggingface.co/datasets/BelleGroup/multiturn_chat_0.8M
- English universal dialogue instructions (94k): sharegpt_vicuna, https://huggingface.co/datasets/jeffwan/sharegpt_vicuna
- Chinese-English-Japanese universal instructions (49k): https://huggingface.co/datasets/JosephusCheung/GuanacoDataset/tree/main |
VatsaDev/TinyText | ---
license: mit
task_categories:
- question-answering
- text-generation
language:
- en
tags:
- code
size_categories:
- 1M<n<10M
---
The entire NanoPhi dataset is in `train.jsonl`.
Separate tasks include:
- Math (Metamath, mammoth)
- Code (Code Search Net)
- Logic (Open-platypus)
- Roleplay (PIPPA, RoleplayIO)
- Textbooks (Tiny-text, Sciphi)
- Textbook QA (Orca-text, Tiny-webtext) |
izumi-lab/pile-modified | ---
dataset_info:
features:
- name: text
dtype: string
- name: meta
struct:
- name: pile_set_name
dtype: string
splits:
- name: train
num_bytes: 1310018830416.1506
num_examples: 210330073
- name: validation
num_bytes: 1346933001.1794941
num_examples: 214369
- name: test
num_bytes: 1315474066.210365
num_examples: 214315
download_size: 104521763483
dataset_size: 1312681237483.5405
---
# Dataset Card for "pile-modified"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fhai50032/dumb | ---
dataset_info:
features:
- name: text
dtype: string
- name: seed
dtype: string
- name: usecase
dtype: string
- name: source
dtype: string
- name: human-verified
dtype: string
splits:
- name: train
num_bytes: 8603295.0
num_examples: 3602
download_size: 3548186
dataset_size: 8603295.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
lonestar108/anger | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validate
path: data/validate-*
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 6706
num_examples: 28
- name: test
num_bytes: 2899
num_examples: 10
- name: validate
num_bytes: 563
num_examples: 3
download_size: 12666
dataset_size: 10168
---
# Dataset Card for "new_anger"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rsuwaileh/IDRISI-RA | ---
license: apache-2.0
---
|
XShadow/RSSOD-Bench | ---
license: cc-by-4.0
---
RSSOD-Bench: a Large-Scale Benchmark Dataset for Salient Object Detection in Optical Remote Sensing Imagery
```
@inproceedings{xiong2023rssod,
title={RSSOD-Bench: a Large-Scale Benchmark Dataset for Salient Object Detection in Optical Remote Sensing Imagery},
author={Xiong, Zhitong and Liu, Yanfeng and Wang, Qi and Zhu, Xiao Xiang},
booktitle={IGARSS 2023-2023 IEEE International Geoscience and Remote Sensing Symposium},
pages={6549--6552},
year={2023},
organization={IEEE}
}
``` |
Nexdata/88_Hours_Mexican_Spanish_Conversational_Speech_Data_by_Telephone | ---
license: cc-by-nc-nd-4.0
---
## Description
Spanish (Mexico) spontaneous dialogue telephony speech dataset, collected from dialogues based on given topics. Transcribed with text content, timestamps, speaker ID, gender and other attributes. Our dataset was collected from an extensive and geographically diverse pool of speakers (122 native speakers), enhancing model performance in real and complex tasks. Quality tested by various AI companies. We strictly adhere to data protection regulations and privacy standards, ensuring the maintenance of user privacy and legal rights throughout the data collection, storage, and usage processes; our datasets are GDPR, CCPA, and PIPL compliant.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1352?source=Huggingface
## Format
8kHz 8bit, a-law/u-law pcm, mono channel
## Content category
Dialogue based on given topics
## Recording condition
Low background noise (indoor)
## Recording device
Telephony
## Country
Mexico(MEX)
## Language(Region) Code
es-MX
## Language
Spanish
## Speaker
122 people in total, 53% male and 47% female
## Features of annotation
Transcription text, timestamp, speaker ID, gender, noise
## Accuracy rate
Word accuracy rate (WAR): 98%
# Licensing Information
Commercial License
|
asthalochan/American_Sign_Language | ---
license: apache-2.0
---
|
shanjay/hf-stack-v1 | ---
dataset_info:
features:
- name: repo_id
dtype: 'null'
- name: file_path
dtype: 'null'
- name: content
dtype: 'null'
splits:
- name: train
num_bytes: 0
num_examples: 0
download_size: 930
dataset_size: 0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_Minami-su__IA_14B | ---
pretty_name: Evaluation run of Minami-su/IA_14B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Minami-su/IA_14B](https://huggingface.co/Minami-su/IA_14B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Minami-su__IA_14B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-21T15:35:10.890241](https://huggingface.co/datasets/open-llm-leaderboard/details_Minami-su__IA_14B/blob/main/results_2024-03-21T15-35-10.890241.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6725231661193563,\n\
\ \"acc_stderr\": 0.03173694433792602,\n \"acc_norm\": 0.6819399785391478,\n\
\ \"acc_norm_stderr\": 0.03237278026633433,\n \"mc1\": 0.43451652386780903,\n\
\ \"mc1_stderr\": 0.017352738749259564,\n \"mc2\": 0.6221767645042083,\n\
\ \"mc2_stderr\": 0.015637740339447825\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5836177474402731,\n \"acc_stderr\": 0.014405618279436178,\n\
\ \"acc_norm\": 0.6237201365187713,\n \"acc_norm_stderr\": 0.014157022555407161\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6034654451304521,\n\
\ \"acc_stderr\": 0.004881780399499135,\n \"acc_norm\": 0.807010555666202,\n\
\ \"acc_norm_stderr\": 0.003938382581186512\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7236842105263158,\n \"acc_stderr\": 0.03639057569952929,\n\
\ \"acc_norm\": 0.7236842105263158,\n \"acc_norm_stderr\": 0.03639057569952929\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n\
\ \"acc_stderr\": 0.03533133389323657,\n \"acc_norm\": 0.6878612716763006,\n\
\ \"acc_norm_stderr\": 0.03533133389323657\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5196078431372549,\n \"acc_stderr\": 0.04971358884367405,\n\
\ \"acc_norm\": 0.5196078431372549,\n \"acc_norm_stderr\": 0.04971358884367405\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6510638297872341,\n \"acc_stderr\": 0.031158522131357783,\n\
\ \"acc_norm\": 0.6510638297872341,\n \"acc_norm_stderr\": 0.031158522131357783\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6827586206896552,\n \"acc_stderr\": 0.03878352372138622,\n\
\ \"acc_norm\": 0.6827586206896552,\n \"acc_norm_stderr\": 0.03878352372138622\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5767195767195767,\n \"acc_stderr\": 0.025446365634406776,\n \"\
acc_norm\": 0.5767195767195767,\n \"acc_norm_stderr\": 0.025446365634406776\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5476190476190477,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.5476190476190477,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.832258064516129,\n\
\ \"acc_stderr\": 0.021255464065371318,\n \"acc_norm\": 0.832258064516129,\n\
\ \"acc_norm_stderr\": 0.021255464065371318\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.03499113137676744,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.03499113137676744\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\"\
: 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.02845038880528436,\n\
\ \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.02845038880528436\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8535353535353535,\n \"acc_stderr\": 0.02519092111460393,\n \"\
acc_norm\": 0.8535353535353535,\n \"acc_norm_stderr\": 0.02519092111460393\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033477,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033477\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7102564102564103,\n \"acc_stderr\": 0.023000628243687957,\n\
\ \"acc_norm\": 0.7102564102564103,\n \"acc_norm_stderr\": 0.023000628243687957\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.4148148148148148,\n \"acc_stderr\": 0.03003984245406929,\n \
\ \"acc_norm\": 0.4148148148148148,\n \"acc_norm_stderr\": 0.03003984245406929\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7478991596638656,\n \"acc_stderr\": 0.028205545033277726,\n\
\ \"acc_norm\": 0.7478991596638656,\n \"acc_norm_stderr\": 0.028205545033277726\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.47019867549668876,\n \"acc_stderr\": 0.040752249922169775,\n \"\
acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.040752249922169775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8550458715596331,\n \"acc_stderr\": 0.015094215699700472,\n \"\
acc_norm\": 0.8550458715596331,\n \"acc_norm_stderr\": 0.015094215699700472\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5648148148148148,\n \"acc_stderr\": 0.033812000056435254,\n \"\
acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.033812000056435254\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944853,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944853\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.03880848301082396,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.03880848301082396\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573973,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573973\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.020237149008990943,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.020237149008990943\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n\
\ \"acc_stderr\": 0.013816335389973147,\n \"acc_norm\": 0.8173690932311622,\n\
\ \"acc_norm_stderr\": 0.013816335389973147\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7630057803468208,\n \"acc_stderr\": 0.02289408248992599,\n\
\ \"acc_norm\": 0.7630057803468208,\n \"acc_norm_stderr\": 0.02289408248992599\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.506145251396648,\n\
\ \"acc_stderr\": 0.01672123848363142,\n \"acc_norm\": 0.506145251396648,\n\
\ \"acc_norm_stderr\": 0.01672123848363142\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.025457756696667864,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.025457756696667864\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.02558306248998483,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.02558306248998483\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495033,\n\
\ \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495033\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4745762711864407,\n\
\ \"acc_stderr\": 0.012753716929101011,\n \"acc_norm\": 0.4745762711864407,\n\
\ \"acc_norm_stderr\": 0.012753716929101011\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6544117647058824,\n \"acc_stderr\": 0.02888819310398864,\n\
\ \"acc_norm\": 0.6544117647058824,\n \"acc_norm_stderr\": 0.02888819310398864\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.01890101532209309,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.01890101532209309\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7836734693877551,\n \"acc_stderr\": 0.026358916334904035,\n\
\ \"acc_norm\": 0.7836734693877551,\n \"acc_norm_stderr\": 0.026358916334904035\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826373,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826373\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"\
acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.43451652386780903,\n\
\ \"mc1_stderr\": 0.017352738749259564,\n \"mc2\": 0.6221767645042083,\n\
\ \"mc2_stderr\": 0.015637740339447825\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7434885556432518,\n \"acc_stderr\": 0.012273648008759984\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2896133434420015,\n \
\ \"acc_stderr\": 0.012493927348659627\n }\n}\n```"
repo_url: https://huggingface.co/Minami-su/IA_14B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|arc:challenge|25_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|gsm8k|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hellaswag|10_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T15-35-10.890241.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T15-35-10.890241.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- '**/details_harness|winogrande|5_2024-03-21T15-35-10.890241.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-21T15-35-10.890241.parquet'
- config_name: results
data_files:
- split: 2024_03_21T15_35_10.890241
path:
- results_2024-03-21T15-35-10.890241.parquet
- split: latest
path:
- results_2024-03-21T15-35-10.890241.parquet
---
# Dataset Card for Evaluation run of Minami-su/IA_14B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Minami-su/IA_14B](https://huggingface.co/Minami-su/IA_14B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named with the timestamp of the run; the "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Minami-su__IA_14B",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-21T15:35:10.890241](https://huggingface.co/datasets/open-llm-leaderboard/details_Minami-su__IA_14B/blob/main/results_2024-03-21T15-35-10.890241.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6725231661193563,
"acc_stderr": 0.03173694433792602,
"acc_norm": 0.6819399785391478,
"acc_norm_stderr": 0.03237278026633433,
"mc1": 0.43451652386780903,
"mc1_stderr": 0.017352738749259564,
"mc2": 0.6221767645042083,
"mc2_stderr": 0.015637740339447825
},
"harness|arc:challenge|25": {
"acc": 0.5836177474402731,
"acc_stderr": 0.014405618279436178,
"acc_norm": 0.6237201365187713,
"acc_norm_stderr": 0.014157022555407161
},
"harness|hellaswag|10": {
"acc": 0.6034654451304521,
"acc_stderr": 0.004881780399499135,
"acc_norm": 0.807010555666202,
"acc_norm_stderr": 0.003938382581186512
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7236842105263158,
"acc_stderr": 0.03639057569952929,
"acc_norm": 0.7236842105263158,
"acc_norm_stderr": 0.03639057569952929
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.03533133389323657,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.03533133389323657
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5196078431372549,
"acc_stderr": 0.04971358884367405,
"acc_norm": 0.5196078431372549,
"acc_norm_stderr": 0.04971358884367405
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6510638297872341,
"acc_stderr": 0.031158522131357783,
"acc_norm": 0.6510638297872341,
"acc_norm_stderr": 0.031158522131357783
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6827586206896552,
"acc_stderr": 0.03878352372138622,
"acc_norm": 0.6827586206896552,
"acc_norm_stderr": 0.03878352372138622
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5767195767195767,
"acc_stderr": 0.025446365634406776,
"acc_norm": 0.5767195767195767,
"acc_norm_stderr": 0.025446365634406776
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5476190476190477,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.5476190476190477,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.832258064516129,
"acc_stderr": 0.021255464065371318,
"acc_norm": 0.832258064516129,
"acc_norm_stderr": 0.021255464065371318
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.03499113137676744,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.03499113137676744
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8424242424242424,
"acc_stderr": 0.02845038880528436,
"acc_norm": 0.8424242424242424,
"acc_norm_stderr": 0.02845038880528436
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8535353535353535,
"acc_stderr": 0.02519092111460393,
"acc_norm": 0.8535353535353535,
"acc_norm_stderr": 0.02519092111460393
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033477,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033477
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7102564102564103,
"acc_stderr": 0.023000628243687957,
"acc_norm": 0.7102564102564103,
"acc_norm_stderr": 0.023000628243687957
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4148148148148148,
"acc_stderr": 0.03003984245406929,
"acc_norm": 0.4148148148148148,
"acc_norm_stderr": 0.03003984245406929
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7478991596638656,
"acc_stderr": 0.028205545033277726,
"acc_norm": 0.7478991596638656,
"acc_norm_stderr": 0.028205545033277726
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.040752249922169775,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.040752249922169775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8550458715596331,
"acc_stderr": 0.015094215699700472,
"acc_norm": 0.8550458715596331,
"acc_norm_stderr": 0.015094215699700472
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.033812000056435254,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.033812000056435254
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944853,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944853
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.03880848301082396,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.03880848301082396
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573973,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573973
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.020237149008990943,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.020237149008990943
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973147,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973147
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7630057803468208,
"acc_stderr": 0.02289408248992599,
"acc_norm": 0.7630057803468208,
"acc_norm_stderr": 0.02289408248992599
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.506145251396648,
"acc_stderr": 0.01672123848363142,
"acc_norm": 0.506145251396648,
"acc_norm_stderr": 0.01672123848363142
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.025457756696667864,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.025457756696667864
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998483,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998483
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7098765432098766,
"acc_stderr": 0.025251173936495033,
"acc_norm": 0.7098765432098766,
"acc_norm_stderr": 0.025251173936495033
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4745762711864407,
"acc_stderr": 0.012753716929101011,
"acc_norm": 0.4745762711864407,
"acc_norm_stderr": 0.012753716929101011
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6544117647058824,
"acc_stderr": 0.02888819310398864,
"acc_norm": 0.6544117647058824,
"acc_norm_stderr": 0.02888819310398864
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.01890101532209309,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.01890101532209309
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7836734693877551,
"acc_stderr": 0.026358916334904035,
"acc_norm": 0.7836734693877551,
"acc_norm_stderr": 0.026358916334904035
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826373,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826373
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.43451652386780903,
"mc1_stderr": 0.017352738749259564,
"mc2": 0.6221767645042083,
"mc2_stderr": 0.015637740339447825
},
"harness|winogrande|5": {
"acc": 0.7434885556432518,
"acc_stderr": 0.012273648008759984
},
"harness|gsm8k|5": {
"acc": 0.2896133434420015,
"acc_stderr": 0.012493927348659627
}
}
```
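The per-task metrics above can also be aggregated by hand. The snippet below is a self-contained sketch that averages `acc` over a small hand-copied subset of the JSON (it downloads nothing, and the subset is illustrative only, not the full 57-task MMLU average):

```python
# A few per-task accuracies hand-copied from the results JSON above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.43},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6666666666666666},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7236842105263158},
    "harness|hendrycksTest-business_ethics|5": {"acc": 0.75},
}

# Keep only the MMLU ("hendrycksTest") subtasks and average their accuracies.
mmlu = {k: v for k, v in results.items() if "hendrycksTest" in k}
mean_acc = sum(v["acc"] for v in mmlu.values()) / len(mmlu)
print(f"mean acc over {len(mmlu)} subtasks: {mean_acc:.4f}")
```

The same pattern applies to `acc_norm`, `mc1`, or `mc2` by swapping the metric key.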
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/10a6ef7d | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 186
num_examples: 10
download_size: 1339
dataset_size: 186
---
# Dataset Card for "10a6ef7d"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vwxyzjn/ultrachat_200k_filtered_1710165106 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: query
list:
- name: content
dtype: string
- name: role
dtype: string
- name: query_token
sequence: int64
- name: query_reference_response
list:
- name: content
dtype: string
- name: role
dtype: string
- name: query_reference_response_token
sequence: int64
- name: query_reference_response_token_len
dtype: int64
- name: query_token_len
dtype: int64
- name: reference_response
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: reference_response_token
sequence: int64
- name: reference_response_token_len
dtype: int64
splits:
- name: train_sft
num_bytes: 11729352.076
num_examples: 403
- name: test_sft
num_bytes: 11837894.376
num_examples: 408
download_size: 4684099
dataset_size: 23567246.452
configs:
- config_name: default
data_files:
- split: train_sft
path: data/train_sft-*
- split: test_sft
path: data/test_sft-*
---
# Args
```python
{'base_model': 'mistralai/Mistral-7B-v0.1',
'check_length_correctness': True,
'debug': True,
'hf_entity': 'vwxyzjn',
'params': TaskQueryHParams(length=None,
format_str='SUBREDDIT: r/{subreddit}\n'
'\n'
'TITLE: {title}\n'
'\n'
'POST: {post}\n'
'\n'
'TL;DR:',
truncate_field='post',
truncate_text='\n',
padding='pad_token',
pad_token=[32000],
pad_side='left',
max_query_length=1024,
max_sft_query_response_length=1280,
max_sft_response_length=256,
max_rm_query_response_length=1280,
max_rm_response_length=256),
'push_to_hub': True}
```
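To make the `TaskQueryHParams` dump above concrete, here is a minimal, hypothetical sketch of how such a template is typically rendered and truncated: the `post` field is trimmed at `truncate_text` boundaries until the rendered query fits. The field names mirror the params dump, but the helper itself is an illustration of the idea, not the actual preprocessing code:

```python
# Template copied from the format_str in the params dump above.
format_str = ("SUBREDDIT: r/{subreddit}\n\n"
              "TITLE: {title}\n\n"
              "POST: {post}\n\n"
              "TL;DR:")

def render_query(example, max_chars, truncate_text="\n"):
    """Drop trailing segments of the 'post' field until the query fits."""
    post = example["post"]
    while len(format_str.format(**{**example, "post": post})) > max_chars:
        cut = post.rfind(truncate_text)
        if cut == -1:  # nothing left to cut at a boundary; hard-truncate
            post = post[:max_chars]
            break
        post = post[:cut]
    return format_str.format(**{**example, "post": post})

example = {"subreddit": "AskReddit",
           "title": "A hypothetical title",
           "post": "First paragraph.\nSecond paragraph.\nThird paragraph."}
query = render_query(example, max_chars=100)
```

In the real pipeline the limit is measured in tokens (`max_query_length=1024`) rather than characters, and the result is then left-padded with `pad_token` up to the fixed query length.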
|
pat-jj/ClinicalTrialSummary_Simple | ---
dataset_info:
features:
- name: article
dtype: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 147644069
num_examples: 62012
- name: validation
num_bytes: 19781190
num_examples: 7752
- name: test
num_bytes: 19929115
num_examples: 7752
download_size: 102569528
dataset_size: 187354374
---
# Dataset Card for "ClinicalTrialSummary_Simple"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BangumiBase/hunterxhunter | ---
license: mit
tags:
- art
size_categories:
- 10K<n<100K
---
# Bangumi Image Base of Hunter X Hunter
This is the image base of the bangumi Hunter x Hunter. We detected 130 characters and 12,906 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may contain noise.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potentially noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:----------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|
| 0 | 3471 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 541 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 306 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 363 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 123 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 154 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 103 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 123 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 52 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 66 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 82 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 29 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 50 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 246 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 31 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 85 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 21 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 45 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 40 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 42 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 61 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 99 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 20 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 118 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 48 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 35 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 142 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 1450 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 43 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 98 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 39 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 42 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 67 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 17 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 27 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 34 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 14 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 15 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 41 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 22 | [Download](39/dataset.zip) |  |  |  |  |  |  |  |  |
| 40 | 24 | [Download](40/dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 49 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| 42 | 38 | [Download](42/dataset.zip) |  |  |  |  |  |  |  |  |
| 43 | 19 | [Download](43/dataset.zip) |  |  |  |  |  |  |  |  |
| 44 | 24 | [Download](44/dataset.zip) |  |  |  |  |  |  |  |  |
| 45 | 236 | [Download](45/dataset.zip) |  |  |  |  |  |  |  |  |
| 46 | 57 | [Download](46/dataset.zip) |  |  |  |  |  |  |  |  |
| 47 | 64 | [Download](47/dataset.zip) |  |  |  |  |  |  |  |  |
| 48 | 34 | [Download](48/dataset.zip) |  |  |  |  |  |  |  |  |
| 49 | 62 | [Download](49/dataset.zip) |  |  |  |  |  |  |  |  |
| 50 | 24 | [Download](50/dataset.zip) |  |  |  |  |  |  |  |  |
| 51 | 21 | [Download](51/dataset.zip) |  |  |  |  |  |  |  |  |
| 52 | 12 | [Download](52/dataset.zip) |  |  |  |  |  |  |  |  |
| 53 | 107 | [Download](53/dataset.zip) |  |  |  |  |  |  |  |  |
| 54 | 18 | [Download](54/dataset.zip) |  |  |  |  |  |  |  |  |
| 55 | 745 | [Download](55/dataset.zip) |  |  |  |  |  |  |  |  |
| 56 | 133 | [Download](56/dataset.zip) |  |  |  |  |  |  |  |  |
| 57 | 277 | [Download](57/dataset.zip) |  |  |  |  |  |  |  |  |
| 58 | 33 | [Download](58/dataset.zip) |  |  |  |  |  |  |  |  |
| 59 | 110 | [Download](59/dataset.zip) |  |  |  |  |  |  |  |  |
| 60 | 65 | [Download](60/dataset.zip) |  |  |  |  |  |  |  |  |
| 61 | 24 | [Download](61/dataset.zip) |  |  |  |  |  |  |  |  |
| 62 | 22 | [Download](62/dataset.zip) |  |  |  |  |  |  |  |  |
| 63 | 35 | [Download](63/dataset.zip) |  |  |  |  |  |  |  |  |
| 64 | 65 | [Download](64/dataset.zip) |  |  |  |  |  |  |  |  |
| 65 | 106 | [Download](65/dataset.zip) |  |  |  |  |  |  |  |  |
| 66 | 49 | [Download](66/dataset.zip) |  |  |  |  |  |  |  |  |
| 67 | 21 | [Download](67/dataset.zip) |  |  |  |  |  |  |  |  |
| 68 | 45 | [Download](68/dataset.zip) |  |  |  |  |  |  |  |  |
| 69 | 67 | [Download](69/dataset.zip) |  |  |  |  |  |  |  |  |
| 70 | 50 | [Download](70/dataset.zip) |  |  |  |  |  |  |  |  |
| 71 | 15 | [Download](71/dataset.zip) |  |  |  |  |  |  |  |  |
| 72 | 52 | [Download](72/dataset.zip) |  |  |  |  |  |  |  |  |
| 73 | 32 | [Download](73/dataset.zip) |  |  |  |  |  |  |  |  |
| 74 | 16 | [Download](74/dataset.zip) |  |  |  |  |  |  |  |  |
| 75 | 11 | [Download](75/dataset.zip) |  |  |  |  |  |  |  |  |
| 76 | 21 | [Download](76/dataset.zip) |  |  |  |  |  |  |  |  |
| 77 | 31 | [Download](77/dataset.zip) |  |  |  |  |  |  |  |  |
| 78 | 38 | [Download](78/dataset.zip) |  |  |  |  |  |  |  |  |
| 79 | 15 | [Download](79/dataset.zip) |  |  |  |  |  |  |  |  |
| 80 | 49 | [Download](80/dataset.zip) |  |  |  |  |  |  |  |  |
| 81 | 13 | [Download](81/dataset.zip) |  |  |  |  |  |  |  |  |
| 82 | 15 | [Download](82/dataset.zip) |  |  |  |  |  |  |  |  |
| 83 | 18 | [Download](83/dataset.zip) |  |  |  |  |  |  |  |  |
| 84 | 16 | [Download](84/dataset.zip) |  |  |  |  |  |  |  |  |
| 85 | 122 | [Download](85/dataset.zip) |  |  |  |  |  |  |  |  |
| 86 | 22 | [Download](86/dataset.zip) |  |  |  |  |  |  |  |  |
| 87 | 16 | [Download](87/dataset.zip) |  |  |  |  |  |  |  |  |
| 88 | 57 | [Download](88/dataset.zip) |  |  |  |  |  |  |  |  |
| 89 | 45 | [Download](89/dataset.zip) |  |  |  |  |  |  |  |  |
| 90 | 20 | [Download](90/dataset.zip) |  |  |  |  |  |  |  |  |
| 91 | 10 | [Download](91/dataset.zip) |  |  |  |  |  |  |  |  |
| 92 | 30 | [Download](92/dataset.zip) |  |  |  |  |  |  |  |  |
| 93 | 14 | [Download](93/dataset.zip) |  |  |  |  |  |  |  |  |
| 94 | 134 | [Download](94/dataset.zip) |  |  |  |  |  |  |  |  |
| 95 | 21 | [Download](95/dataset.zip) |  |  |  |  |  |  |  |  |
| 96 | 26 | [Download](96/dataset.zip) |  |  |  |  |  |  |  |  |
| 97 | 69 | [Download](97/dataset.zip) |  |  |  |  |  |  |  |  |
| 98 | 8 | [Download](98/dataset.zip) |  |  |  |  |  |  |  |  |
| 99 | 17 | [Download](99/dataset.zip) |  |  |  |  |  |  |  |  |
| 100 | 18 | [Download](100/dataset.zip) |  |  |  |  |  |  |  |  |
| 101 | 5 | [Download](101/dataset.zip) |  |  |  |  |  | N/A | N/A | N/A |
| 102 | 27 | [Download](102/dataset.zip) |  |  |  |  |  |  |  |  |
| 103 | 25 | [Download](103/dataset.zip) |  |  |  |  |  |  |  |  |
| 104 | 60 | [Download](104/dataset.zip) |  |  |  |  |  |  |  |  |
| 105 | 25 | [Download](105/dataset.zip) |  |  |  |  |  |  |  |  |
| 106 | 16 | [Download](106/dataset.zip) |  |  |  |  |  |  |  |  |
| 107 | 11 | [Download](107/dataset.zip) |  |  |  |  |  |  |  |  |
| 108 | 25 | [Download](108/dataset.zip) |  |  |  |  |  |  |  |  |
| 109 | 14 | [Download](109/dataset.zip) |  |  |  |  |  |  |  |  |
| 110 | 53 | [Download](110/dataset.zip) |  |  |  |  |  |  |  |  |
| 111 | 23 | [Download](111/dataset.zip) |  |  |  |  |  |  |  |  |
| 112 | 21 | [Download](112/dataset.zip) |  |  |  |  |  |  |  |  |
| 113 | 12 | [Download](113/dataset.zip) |  |  |  |  |  |  |  |  |
| 114 | 9 | [Download](114/dataset.zip) |  |  |  |  |  |  |  |  |
| 115 | 13 | [Download](115/dataset.zip) |  |  |  |  |  |  |  |  |
| 116 | 47 | [Download](116/dataset.zip) |  |  |  |  |  |  |  |  |
| 117 | 11 | [Download](117/dataset.zip) |  |  |  |  |  |  |  |  |
| 118 | 6 | [Download](118/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 119 | 6 | [Download](119/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 120 | 8 | [Download](120/dataset.zip) |  |  |  |  |  |  |  |  |
| 121 | 54 | [Download](121/dataset.zip) |  |  |  |  |  |  |  |  |
| 122 | 25 | [Download](122/dataset.zip) |  |  |  |  |  |  |  |  |
| 123 | 53 | [Download](123/dataset.zip) |  |  |  |  |  |  |  |  |
| 124 | 8 | [Download](124/dataset.zip) |  |  |  |  |  |  |  |  |
| 125 | 16 | [Download](125/dataset.zip) |  |  |  |  |  |  |  |  |
| 126 | 22 | [Download](126/dataset.zip) |  |  |  |  |  |  |  |  |
| 127 | 6 | [Download](127/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 128 | 57 | [Download](128/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 236 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
researchjyotsna/isic2018_10 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 2553258.0
num_examples: 10
download_size: 2553780
dataset_size: 2553258.0
---
# Dataset Card for "isic2018_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heliosprime/twitter_dataset_1713033735 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 11628
num_examples: 28
download_size: 9431
dataset_size: 11628
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713033735"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_BEE-spoke-data__smol_llama-220M-open_instruct | ---
pretty_name: Evaluation run of BEE-spoke-data/smol_llama-220M-open_instruct
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [BEE-spoke-data/smol_llama-220M-open_instruct](https://huggingface.co/BEE-spoke-data/smol_llama-220M-open_instruct)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BEE-spoke-data__smol_llama-220M-open_instruct\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-04T13:39:47.179873](https://huggingface.co/datasets/open-llm-leaderboard/details_BEE-spoke-data__smol_llama-220M-open_instruct/blob/main/results_2024-01-04T13-39-47.179873.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25998788076923746,\n\
\ \"acc_stderr\": 0.030908234134550842,\n \"acc_norm\": 0.2615422501208712,\n\
\ \"acc_norm_stderr\": 0.03173823021382238,\n \"mc1\": 0.2423500611995104,\n\
\ \"mc1_stderr\": 0.01500067437357034,\n \"mc2\": 0.4406371478334913,\n\
\ \"mc2_stderr\": 0.015537102899912702\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.19283276450511946,\n \"acc_stderr\": 0.011529055465663345,\n\
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.012653835621466646\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.27972515435172274,\n\
\ \"acc_stderr\": 0.004479467619464786,\n \"acc_norm\": 0.29705238000398326,\n\
\ \"acc_norm_stderr\": 0.00456025908319737\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.035914440841969694,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.035914440841969694\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19078947368421054,\n \"acc_stderr\": 0.03197565821032499,\n\
\ \"acc_norm\": 0.19078947368421054,\n \"acc_norm_stderr\": 0.03197565821032499\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.18,\n\
\ \"acc_stderr\": 0.03861229196653696,\n \"acc_norm\": 0.18,\n \
\ \"acc_norm_stderr\": 0.03861229196653696\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.33962264150943394,\n \"acc_stderr\": 0.02914690474779834,\n\
\ \"acc_norm\": 0.33962264150943394,\n \"acc_norm_stderr\": 0.02914690474779834\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.26011560693641617,\n\
\ \"acc_stderr\": 0.03345036916788992,\n \"acc_norm\": 0.26011560693641617,\n\
\ \"acc_norm_stderr\": 0.03345036916788992\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.044619604333847415\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.02880998985410297,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.02880998985410297\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.042663394431593935,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.042663394431593935\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.25517241379310346,\n \"acc_stderr\": 0.03632984052707842,\n\
\ \"acc_norm\": 0.25517241379310346,\n \"acc_norm_stderr\": 0.03632984052707842\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25925925925925924,\n \"acc_stderr\": 0.022569897074918417,\n \"\
acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.022569897074918417\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1746031746031746,\n\
\ \"acc_stderr\": 0.03395490020856113,\n \"acc_norm\": 0.1746031746031746,\n\
\ \"acc_norm_stderr\": 0.03395490020856113\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.267741935483871,\n\
\ \"acc_stderr\": 0.02518900666021238,\n \"acc_norm\": 0.267741935483871,\n\
\ \"acc_norm_stderr\": 0.02518900666021238\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.031785297106427496,\n\
\ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.031785297106427496\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2474747474747475,\n \"acc_stderr\": 0.030746300742124495,\n \"\
acc_norm\": 0.2474747474747475,\n \"acc_norm_stderr\": 0.030746300742124495\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.29015544041450775,\n \"acc_stderr\": 0.032752644677915145,\n\
\ \"acc_norm\": 0.29015544041450775,\n \"acc_norm_stderr\": 0.032752644677915145\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3277310924369748,\n \"acc_stderr\": 0.030489911417673227,\n\
\ \"acc_norm\": 0.3277310924369748,\n \"acc_norm_stderr\": 0.030489911417673227\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658754,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658754\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3321100917431193,\n \"acc_stderr\": 0.020192682985423344,\n \"\
acc_norm\": 0.3321100917431193,\n \"acc_norm_stderr\": 0.020192682985423344\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604246,\n \"\
acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604246\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2616033755274262,\n \"acc_stderr\": 0.02860951671699494,\n \
\ \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.02860951671699494\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.27802690582959644,\n\
\ \"acc_stderr\": 0.03006958487449403,\n \"acc_norm\": 0.27802690582959644,\n\
\ \"acc_norm_stderr\": 0.03006958487449403\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728742,\n\
\ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728742\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.19008264462809918,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.19008264462809918,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.23148148148148148,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.26993865030674846,\n \"acc_stderr\": 0.034878251684978906,\n\
\ \"acc_norm\": 0.26993865030674846,\n \"acc_norm_stderr\": 0.034878251684978906\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n\
\ \"acc_stderr\": 0.041577515398656284,\n \"acc_norm\": 0.25892857142857145,\n\
\ \"acc_norm_stderr\": 0.041577515398656284\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.27184466019417475,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.27184466019417475,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.20512820512820512,\n\
\ \"acc_stderr\": 0.026453508054040346,\n \"acc_norm\": 0.20512820512820512,\n\
\ \"acc_norm_stderr\": 0.026453508054040346\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2247765006385696,\n\
\ \"acc_stderr\": 0.01492744710193717,\n \"acc_norm\": 0.2247765006385696,\n\
\ \"acc_norm_stderr\": 0.01492744710193717\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2254335260115607,\n \"acc_stderr\": 0.022497230190967547,\n\
\ \"acc_norm\": 0.2254335260115607,\n \"acc_norm_stderr\": 0.022497230190967547\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n\
\ \"acc_stderr\": 0.014355911964767864,\n \"acc_norm\": 0.2435754189944134,\n\
\ \"acc_norm_stderr\": 0.014355911964767864\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.024630048979824768,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.024630048979824768\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2347266881028939,\n\
\ \"acc_stderr\": 0.024071805887677045,\n \"acc_norm\": 0.2347266881028939,\n\
\ \"acc_norm_stderr\": 0.024071805887677045\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2553191489361702,\n \"acc_stderr\": 0.026011992930902013,\n \
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.026011992930902013\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24185136897001303,\n\
\ \"acc_stderr\": 0.010936550813827066,\n \"acc_norm\": 0.24185136897001303,\n\
\ \"acc_norm_stderr\": 0.010936550813827066\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.030187532060329376,\n\
\ \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.030187532060329376\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2434640522875817,\n \"acc_stderr\": 0.017362473762146634,\n \
\ \"acc_norm\": 0.2434640522875817,\n \"acc_norm_stderr\": 0.017362473762146634\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.24545454545454545,\n\
\ \"acc_stderr\": 0.041220665028782834,\n \"acc_norm\": 0.24545454545454545,\n\
\ \"acc_norm_stderr\": 0.041220665028782834\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3346938775510204,\n \"acc_stderr\": 0.030209235226242307,\n\
\ \"acc_norm\": 0.3346938775510204,\n \"acc_norm_stderr\": 0.030209235226242307\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n\
\ \"acc_stderr\": 0.030567675938916714,\n \"acc_norm\": 0.24875621890547264,\n\
\ \"acc_norm_stderr\": 0.030567675938916714\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.18072289156626506,\n\
\ \"acc_stderr\": 0.02995573785581014,\n \"acc_norm\": 0.18072289156626506,\n\
\ \"acc_norm_stderr\": 0.02995573785581014\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2046783625730994,\n \"acc_stderr\": 0.030944459778533207,\n\
\ \"acc_norm\": 0.2046783625730994,\n \"acc_norm_stderr\": 0.030944459778533207\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2423500611995104,\n\
\ \"mc1_stderr\": 0.01500067437357034,\n \"mc2\": 0.4406371478334913,\n\
\ \"mc2_stderr\": 0.015537102899912702\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5027624309392266,\n \"acc_stderr\": 0.014052271211616441\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/BEE-spoke-data/smol_llama-220M-open_instruct
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|arc:challenge|25_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|gsm8k|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hellaswag|10_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T13-39-47.179873.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-04T13-39-47.179873.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- '**/details_harness|winogrande|5_2024-01-04T13-39-47.179873.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-04T13-39-47.179873.parquet'
- config_name: results
data_files:
- split: 2024_01_04T13_39_47.179873
path:
- results_2024-01-04T13-39-47.179873.parquet
- split: latest
path:
- results_2024-01-04T13-39-47.179873.parquet
---
# Dataset Card for Evaluation run of BEE-spoke-data/smol_llama-220M-open_instruct
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BEE-spoke-data/smol_llama-220M-open_instruct](https://huggingface.co/BEE-spoke-data/smol_llama-220M-open_instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BEE-spoke-data__smol_llama-220M-open_instruct",
"harness_winogrande_5",
split="train")
```
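Once a details split is loaded, per-task scores can also be aggregated locally. As a hypothetical offline sketch (the task names and values below are copied from the results JSON further down for illustration; this is not the leaderboard's own aggregation code):

```python
# Sketch: averaging per-task accuracies, mirroring how the "all" entry
# summarizes individual task results. Values are illustrative samples
# taken from the results JSON below.
task_scores = {
    "harness|arc:challenge|25": 0.19283276450511946,
    "harness|hellaswag|10": 0.27972515435172274,
    "harness|hendrycksTest-abstract_algebra|5": 0.19,
}

# Unweighted mean over the selected tasks.
mean_acc = sum(task_scores.values()) / len(task_scores)
print(f"mean acc over {len(task_scores)} tasks: {mean_acc:.4f}")
```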
## Latest results
These are the [latest results from run 2024-01-04T13:39:47.179873](https://huggingface.co/datasets/open-llm-leaderboard/details_BEE-spoke-data__smol_llama-220M-open_instruct/blob/main/results_2024-01-04T13-39-47.179873.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.25998788076923746,
"acc_stderr": 0.030908234134550842,
"acc_norm": 0.2615422501208712,
"acc_norm_stderr": 0.03173823021382238,
"mc1": 0.2423500611995104,
"mc1_stderr": 0.01500067437357034,
"mc2": 0.4406371478334913,
"mc2_stderr": 0.015537102899912702
},
"harness|arc:challenge|25": {
"acc": 0.19283276450511946,
"acc_stderr": 0.011529055465663345,
"acc_norm": 0.25,
"acc_norm_stderr": 0.012653835621466646
},
"harness|hellaswag|10": {
"acc": 0.27972515435172274,
"acc_stderr": 0.004479467619464786,
"acc_norm": 0.29705238000398326,
"acc_norm_stderr": 0.00456025908319737
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.035914440841969694,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.035914440841969694
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19078947368421054,
"acc_stderr": 0.03197565821032499,
"acc_norm": 0.19078947368421054,
"acc_norm_stderr": 0.03197565821032499
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653696,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653696
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.33962264150943394,
"acc_stderr": 0.02914690474779834,
"acc_norm": 0.33962264150943394,
"acc_norm_stderr": 0.02914690474779834
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.25,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.03345036916788992,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.03345036916788992
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.02880998985410297,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.02880998985410297
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.042663394431593935,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.042663394431593935
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.25517241379310346,
"acc_stderr": 0.03632984052707842,
"acc_norm": 0.25517241379310346,
"acc_norm_stderr": 0.03632984052707842
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.022569897074918417,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.022569897074918417
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1746031746031746,
"acc_stderr": 0.03395490020856113,
"acc_norm": 0.1746031746031746,
"acc_norm_stderr": 0.03395490020856113
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.267741935483871,
"acc_stderr": 0.02518900666021238,
"acc_norm": 0.267741935483871,
"acc_norm_stderr": 0.02518900666021238
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.031785297106427496,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.031785297106427496
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2474747474747475,
"acc_stderr": 0.030746300742124495,
"acc_norm": 0.2474747474747475,
"acc_norm_stderr": 0.030746300742124495
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.29015544041450775,
"acc_stderr": 0.032752644677915145,
"acc_norm": 0.29015544041450775,
"acc_norm_stderr": 0.032752644677915145
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3277310924369748,
"acc_stderr": 0.030489911417673227,
"acc_norm": 0.3277310924369748,
"acc_norm_stderr": 0.030489911417673227
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658754,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658754
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3321100917431193,
"acc_stderr": 0.020192682985423344,
"acc_norm": 0.3321100917431193,
"acc_norm_stderr": 0.020192682985423344
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.02860951671699494,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.02860951671699494
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.27802690582959644,
"acc_stderr": 0.03006958487449403,
"acc_norm": 0.27802690582959644,
"acc_norm_stderr": 0.03006958487449403
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.03768335959728742,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.03768335959728742
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.19008264462809918,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.19008264462809918,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26993865030674846,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.26993865030674846,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.041577515398656284,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.041577515398656284
},
"harness|hendrycksTest-management|5": {
"acc": 0.27184466019417475,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.27184466019417475,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.20512820512820512,
"acc_stderr": 0.026453508054040346,
"acc_norm": 0.20512820512820512,
"acc_norm_stderr": 0.026453508054040346
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2247765006385696,
"acc_stderr": 0.01492744710193717,
"acc_norm": 0.2247765006385696,
"acc_norm_stderr": 0.01492744710193717
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.022497230190967547,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.022497230190967547
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.014355911964767864,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.014355911964767864
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.024630048979824768,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.024630048979824768
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2347266881028939,
"acc_stderr": 0.024071805887677045,
"acc_norm": 0.2347266881028939,
"acc_norm_stderr": 0.024071805887677045
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.026011992930902013,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.026011992930902013
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24185136897001303,
"acc_stderr": 0.010936550813827066,
"acc_norm": 0.24185136897001303,
"acc_norm_stderr": 0.010936550813827066
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.44485294117647056,
"acc_stderr": 0.030187532060329376,
"acc_norm": 0.44485294117647056,
"acc_norm_stderr": 0.030187532060329376
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2434640522875817,
"acc_stderr": 0.017362473762146634,
"acc_norm": 0.2434640522875817,
"acc_norm_stderr": 0.017362473762146634
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.24545454545454545,
"acc_stderr": 0.041220665028782834,
"acc_norm": 0.24545454545454545,
"acc_norm_stderr": 0.041220665028782834
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3346938775510204,
"acc_stderr": 0.030209235226242307,
"acc_norm": 0.3346938775510204,
"acc_norm_stderr": 0.030209235226242307
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916714,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916714
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-virology|5": {
"acc": 0.18072289156626506,
"acc_stderr": 0.02995573785581014,
"acc_norm": 0.18072289156626506,
"acc_norm_stderr": 0.02995573785581014
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2046783625730994,
"acc_stderr": 0.030944459778533207,
"acc_norm": 0.2046783625730994,
"acc_norm_stderr": 0.030944459778533207
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2423500611995104,
"mc1_stderr": 0.01500067437357034,
"mc2": 0.4406371478334913,
"mc2_stderr": 0.015537102899912702
},
"harness|winogrande|5": {
"acc": 0.5027624309392266,
"acc_stderr": 0.014052271211616441
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
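The per-task blocks above are plain JSON, so aggregate scores (for example the MMLU average reported on the leaderboard, which is the mean over the `hendrycksTest-*` tasks) can be recomputed directly. A minimal sketch — the dict below is a two-task excerpt for illustration, not the full results file:

```python
# Hedged sketch: average the `acc` of all hendrycksTest (MMLU) entries
# in a results blob shaped like the JSON above. Only two tasks are
# excerpted here; with the real file you would json.load() the whole thing.
results = {
    "harness|hendrycksTest-virology|5": {"acc": 0.18072289156626506},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.2046783625730994},
    "harness|gsm8k|5": {"acc": 0.0},  # non-MMLU task, excluded from the average
}

mmlu = [v["acc"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")]
mmlu_avg = sum(mmlu) / len(mmlu)
print(round(mmlu_avg, 4))
```

The same filtering-by-prefix pattern works for any of the `harness|…` task families in the blob.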
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_Charlie911__MultiLora-drop-sharegpt | ---
pretty_name: Evaluation run of Charlie911/MultiLora-drop-sharegpt
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Charlie911/MultiLora-drop-sharegpt](https://huggingface.co/Charlie911/MultiLora-drop-sharegpt)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Charlie911__MultiLora-drop-sharegpt\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-23T20:13:52.401722](https://huggingface.co/datasets/open-llm-leaderboard/details_Charlie911__MultiLora-drop-sharegpt/blob/main/results_2024-01-23T20-13-52.401722.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4103189833137969,\n\
\ \"acc_stderr\": 0.034440356205935406,\n \"acc_norm\": 0.4152994817197388,\n\
\ \"acc_norm_stderr\": 0.03526445965091106,\n \"mc1\": 0.3047735618115055,\n\
\ \"mc1_stderr\": 0.016114124156882455,\n \"mc2\": 0.44825405044907884,\n\
\ \"mc2_stderr\": 0.014892271476699756\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4445392491467577,\n \"acc_stderr\": 0.014521226405627077,\n\
\ \"acc_norm\": 0.4761092150170648,\n \"acc_norm_stderr\": 0.014594701798071655\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.49302927703644694,\n\
\ \"acc_stderr\": 0.004989296471157074,\n \"acc_norm\": 0.6597291376219877,\n\
\ \"acc_norm_stderr\": 0.0047283185778352246\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4148148148148148,\n\
\ \"acc_stderr\": 0.04256193767901407,\n \"acc_norm\": 0.4148148148148148,\n\
\ \"acc_norm_stderr\": 0.04256193767901407\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.40131578947368424,\n \"acc_stderr\": 0.039889037033362836,\n\
\ \"acc_norm\": 0.40131578947368424,\n \"acc_norm_stderr\": 0.039889037033362836\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.46,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.46037735849056605,\n \"acc_stderr\": 0.030676096599389188,\n\
\ \"acc_norm\": 0.46037735849056605,\n \"acc_norm_stderr\": 0.030676096599389188\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3680555555555556,\n\
\ \"acc_stderr\": 0.04032999053960718,\n \"acc_norm\": 0.3680555555555556,\n\
\ \"acc_norm_stderr\": 0.04032999053960718\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4046242774566474,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.4046242774566474,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171453,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171453\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.43829787234042555,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.43829787234042555,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n\
\ \"acc_stderr\": 0.03892431106518754,\n \"acc_norm\": 0.21929824561403508,\n\
\ \"acc_norm_stderr\": 0.03892431106518754\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.33793103448275863,\n \"acc_stderr\": 0.0394170763206489,\n\
\ \"acc_norm\": 0.33793103448275863,\n \"acc_norm_stderr\": 0.0394170763206489\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2619047619047619,\n \"acc_stderr\": 0.022644212615525214,\n \"\
acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.022644212615525214\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.04343525428949098,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.04343525428949098\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.49032258064516127,\n\
\ \"acc_stderr\": 0.02843867799890955,\n \"acc_norm\": 0.49032258064516127,\n\
\ \"acc_norm_stderr\": 0.02843867799890955\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.32019704433497537,\n \"acc_stderr\": 0.032826493853041504,\n\
\ \"acc_norm\": 0.32019704433497537,\n \"acc_norm_stderr\": 0.032826493853041504\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\"\
: 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.46060606060606063,\n \"acc_stderr\": 0.03892207016552012,\n\
\ \"acc_norm\": 0.46060606060606063,\n \"acc_norm_stderr\": 0.03892207016552012\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.035623524993954825,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.035623524993954825\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\"\
: {\n \"acc\": 0.5233160621761658,\n \"acc_stderr\": 0.03604513672442202,\n\
\ \"acc_norm\": 0.5233160621761658,\n \"acc_norm_stderr\": 0.03604513672442202\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.36923076923076925,\n \"acc_stderr\": 0.02446861524147891,\n\
\ \"acc_norm\": 0.36923076923076925,\n \"acc_norm_stderr\": 0.02446861524147891\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945277,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945277\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4369747899159664,\n \"acc_stderr\": 0.032219436365661956,\n\
\ \"acc_norm\": 0.4369747899159664,\n \"acc_norm_stderr\": 0.032219436365661956\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.4917431192660551,\n \"acc_stderr\": 0.021434399918214334,\n \"\
acc_norm\": 0.4917431192660551,\n \"acc_norm_stderr\": 0.021434399918214334\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4027777777777778,\n \"acc_stderr\": 0.033448873829978666,\n \"\
acc_norm\": 0.4027777777777778,\n \"acc_norm_stderr\": 0.033448873829978666\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5147058823529411,\n \"acc_stderr\": 0.035077938347913236,\n \"\
acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.035077938347913236\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.569620253164557,\n \"acc_stderr\": 0.032230171959375976,\n \
\ \"acc_norm\": 0.569620253164557,\n \"acc_norm_stderr\": 0.032230171959375976\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3721973094170404,\n\
\ \"acc_stderr\": 0.03244305283008731,\n \"acc_norm\": 0.3721973094170404,\n\
\ \"acc_norm_stderr\": 0.03244305283008731\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5190839694656488,\n \"acc_stderr\": 0.04382094705550989,\n\
\ \"acc_norm\": 0.5190839694656488,\n \"acc_norm_stderr\": 0.04382094705550989\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5041322314049587,\n \"acc_stderr\": 0.04564198767432754,\n \"\
acc_norm\": 0.5041322314049587,\n \"acc_norm_stderr\": 0.04564198767432754\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4537037037037037,\n\
\ \"acc_stderr\": 0.04812917324536821,\n \"acc_norm\": 0.4537037037037037,\n\
\ \"acc_norm_stderr\": 0.04812917324536821\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3987730061349693,\n \"acc_stderr\": 0.03847021420456026,\n\
\ \"acc_norm\": 0.3987730061349693,\n \"acc_norm_stderr\": 0.03847021420456026\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285714,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285714\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5145631067961165,\n \"acc_stderr\": 0.049486373240266356,\n\
\ \"acc_norm\": 0.5145631067961165,\n \"acc_norm_stderr\": 0.049486373240266356\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5512820512820513,\n\
\ \"acc_stderr\": 0.032583346493868806,\n \"acc_norm\": 0.5512820512820513,\n\
\ \"acc_norm_stderr\": 0.032583346493868806\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5351213282247765,\n\
\ \"acc_stderr\": 0.017835798806290645,\n \"acc_norm\": 0.5351213282247765,\n\
\ \"acc_norm_stderr\": 0.017835798806290645\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.40173410404624277,\n \"acc_stderr\": 0.02639410417764363,\n\
\ \"acc_norm\": 0.40173410404624277,\n \"acc_norm_stderr\": 0.02639410417764363\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n\
\ \"acc_stderr\": 0.014893391735249594,\n \"acc_norm\": 0.27262569832402234,\n\
\ \"acc_norm_stderr\": 0.014893391735249594\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.02845263998508801,\n\
\ \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.02845263998508801\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4630225080385852,\n\
\ \"acc_stderr\": 0.028320325830105908,\n \"acc_norm\": 0.4630225080385852,\n\
\ \"acc_norm_stderr\": 0.028320325830105908\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.404320987654321,\n \"acc_stderr\": 0.02730662529732769,\n\
\ \"acc_norm\": 0.404320987654321,\n \"acc_norm_stderr\": 0.02730662529732769\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3049645390070922,\n \"acc_stderr\": 0.027464708442022128,\n \
\ \"acc_norm\": 0.3049645390070922,\n \"acc_norm_stderr\": 0.027464708442022128\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3409387222946545,\n\
\ \"acc_stderr\": 0.01210681720306721,\n \"acc_norm\": 0.3409387222946545,\n\
\ \"acc_norm_stderr\": 0.01210681720306721\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4852941176470588,\n \"acc_stderr\": 0.03035969707904612,\n\
\ \"acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.03035969707904612\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3366013071895425,\n \"acc_stderr\": 0.019117213911495158,\n \
\ \"acc_norm\": 0.3366013071895425,\n \"acc_norm_stderr\": 0.019117213911495158\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.4,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5224489795918368,\n \"acc_stderr\": 0.031976941187136725,\n\
\ \"acc_norm\": 0.5224489795918368,\n \"acc_norm_stderr\": 0.031976941187136725\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5621890547263682,\n\
\ \"acc_stderr\": 0.0350808011219984,\n \"acc_norm\": 0.5621890547263682,\n\
\ \"acc_norm_stderr\": 0.0350808011219984\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3674698795180723,\n\
\ \"acc_stderr\": 0.03753267402120574,\n \"acc_norm\": 0.3674698795180723,\n\
\ \"acc_norm_stderr\": 0.03753267402120574\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5497076023391813,\n \"acc_stderr\": 0.038158273659132366,\n\
\ \"acc_norm\": 0.5497076023391813,\n \"acc_norm_stderr\": 0.038158273659132366\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3047735618115055,\n\
\ \"mc1_stderr\": 0.016114124156882455,\n \"mc2\": 0.44825405044907884,\n\
\ \"mc2_stderr\": 0.014892271476699756\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6606156274664562,\n \"acc_stderr\": 0.01330771492894175\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06444275966641395,\n \
\ \"acc_stderr\": 0.006763391728488269\n }\n}\n```"
repo_url: https://huggingface.co/Charlie911/MultiLora-drop-sharegpt
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|arc:challenge|25_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|arc:challenge|25_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|gsm8k|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|gsm8k|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hellaswag|10_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hellaswag|10_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T13-09-40.309732.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T20-13-52.401722.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-23T20-13-52.401722.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- '**/details_harness|winogrande|5_2024-01-23T13-09-40.309732.parquet'
- split: 2024_01_23T20_13_52.401722
path:
- '**/details_harness|winogrande|5_2024-01-23T20-13-52.401722.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-23T20-13-52.401722.parquet'
- config_name: results
data_files:
- split: 2024_01_23T13_09_40.309732
path:
- results_2024-01-23T13-09-40.309732.parquet
- split: 2024_01_23T20_13_52.401722
path:
- results_2024-01-23T20-13-52.401722.parquet
- split: latest
path:
- results_2024-01-23T20-13-52.401722.parquet
---
# Dataset Card for Evaluation run of Charlie911/MultiLora-drop-sharegpt
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Charlie911/MultiLora-drop-sharegpt](https://huggingface.co/Charlie911/MultiLora-drop-sharegpt) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Charlie911__MultiLora-drop-sharegpt",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-23T20:13:52.401722](https://huggingface.co/datasets/open-llm-leaderboard/details_Charlie911__MultiLora-drop-sharegpt/blob/main/results_2024-01-23T20-13-52.401722.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4103189833137969,
"acc_stderr": 0.034440356205935406,
"acc_norm": 0.4152994817197388,
"acc_norm_stderr": 0.03526445965091106,
"mc1": 0.3047735618115055,
"mc1_stderr": 0.016114124156882455,
"mc2": 0.44825405044907884,
"mc2_stderr": 0.014892271476699756
},
"harness|arc:challenge|25": {
"acc": 0.4445392491467577,
"acc_stderr": 0.014521226405627077,
"acc_norm": 0.4761092150170648,
"acc_norm_stderr": 0.014594701798071655
},
"harness|hellaswag|10": {
"acc": 0.49302927703644694,
"acc_stderr": 0.004989296471157074,
"acc_norm": 0.6597291376219877,
"acc_norm_stderr": 0.0047283185778352246
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4148148148148148,
"acc_stderr": 0.04256193767901407,
"acc_norm": 0.4148148148148148,
"acc_norm_stderr": 0.04256193767901407
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.40131578947368424,
"acc_stderr": 0.039889037033362836,
"acc_norm": 0.40131578947368424,
"acc_norm_stderr": 0.039889037033362836
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.46037735849056605,
"acc_stderr": 0.030676096599389188,
"acc_norm": 0.46037735849056605,
"acc_norm_stderr": 0.030676096599389188
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3680555555555556,
"acc_stderr": 0.04032999053960718,
"acc_norm": 0.3680555555555556,
"acc_norm_stderr": 0.04032999053960718
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4046242774566474,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.4046242774566474,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171453,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171453
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.43829787234042555,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.43829787234042555,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21929824561403508,
"acc_stderr": 0.03892431106518754,
"acc_norm": 0.21929824561403508,
"acc_norm_stderr": 0.03892431106518754
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.33793103448275863,
"acc_stderr": 0.0394170763206489,
"acc_norm": 0.33793103448275863,
"acc_norm_stderr": 0.0394170763206489
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.022644212615525214,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.022644212615525214
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949098,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949098
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.49032258064516127,
"acc_stderr": 0.02843867799890955,
"acc_norm": 0.49032258064516127,
"acc_norm_stderr": 0.02843867799890955
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.32019704433497537,
"acc_stderr": 0.032826493853041504,
"acc_norm": 0.32019704433497537,
"acc_norm_stderr": 0.032826493853041504
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.46060606060606063,
"acc_stderr": 0.03892207016552012,
"acc_norm": 0.46060606060606063,
"acc_norm_stderr": 0.03892207016552012
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5,
"acc_stderr": 0.035623524993954825,
"acc_norm": 0.5,
"acc_norm_stderr": 0.035623524993954825
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5233160621761658,
"acc_stderr": 0.03604513672442202,
"acc_norm": 0.5233160621761658,
"acc_norm_stderr": 0.03604513672442202
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.36923076923076925,
"acc_stderr": 0.02446861524147891,
"acc_norm": 0.36923076923076925,
"acc_norm_stderr": 0.02446861524147891
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945277,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945277
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4369747899159664,
"acc_stderr": 0.032219436365661956,
"acc_norm": 0.4369747899159664,
"acc_norm_stderr": 0.032219436365661956
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.4917431192660551,
"acc_stderr": 0.021434399918214334,
"acc_norm": 0.4917431192660551,
"acc_norm_stderr": 0.021434399918214334
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4027777777777778,
"acc_stderr": 0.033448873829978666,
"acc_norm": 0.4027777777777778,
"acc_norm_stderr": 0.033448873829978666
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5147058823529411,
"acc_stderr": 0.035077938347913236,
"acc_norm": 0.5147058823529411,
"acc_norm_stderr": 0.035077938347913236
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.569620253164557,
"acc_stderr": 0.032230171959375976,
"acc_norm": 0.569620253164557,
"acc_norm_stderr": 0.032230171959375976
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3721973094170404,
"acc_stderr": 0.03244305283008731,
"acc_norm": 0.3721973094170404,
"acc_norm_stderr": 0.03244305283008731
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5190839694656488,
"acc_stderr": 0.04382094705550989,
"acc_norm": 0.5190839694656488,
"acc_norm_stderr": 0.04382094705550989
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5041322314049587,
"acc_stderr": 0.04564198767432754,
"acc_norm": 0.5041322314049587,
"acc_norm_stderr": 0.04564198767432754
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.04812917324536821,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.04812917324536821
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3987730061349693,
"acc_stderr": 0.03847021420456026,
"acc_norm": 0.3987730061349693,
"acc_norm_stderr": 0.03847021420456026
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285714,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285714
},
"harness|hendrycksTest-management|5": {
"acc": 0.5145631067961165,
"acc_stderr": 0.049486373240266356,
"acc_norm": 0.5145631067961165,
"acc_norm_stderr": 0.049486373240266356
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5512820512820513,
"acc_stderr": 0.032583346493868806,
"acc_norm": 0.5512820512820513,
"acc_norm_stderr": 0.032583346493868806
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5351213282247765,
"acc_stderr": 0.017835798806290645,
"acc_norm": 0.5351213282247765,
"acc_norm_stderr": 0.017835798806290645
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.40173410404624277,
"acc_stderr": 0.02639410417764363,
"acc_norm": 0.40173410404624277,
"acc_norm_stderr": 0.02639410417764363
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249594,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249594
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.02845263998508801,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.02845263998508801
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4630225080385852,
"acc_stderr": 0.028320325830105908,
"acc_norm": 0.4630225080385852,
"acc_norm_stderr": 0.028320325830105908
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.404320987654321,
"acc_stderr": 0.02730662529732769,
"acc_norm": 0.404320987654321,
"acc_norm_stderr": 0.02730662529732769
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3049645390070922,
"acc_stderr": 0.027464708442022128,
"acc_norm": 0.3049645390070922,
"acc_norm_stderr": 0.027464708442022128
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3409387222946545,
"acc_stderr": 0.01210681720306721,
"acc_norm": 0.3409387222946545,
"acc_norm_stderr": 0.01210681720306721
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4852941176470588,
"acc_stderr": 0.03035969707904612,
"acc_norm": 0.4852941176470588,
"acc_norm_stderr": 0.03035969707904612
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3366013071895425,
"acc_stderr": 0.019117213911495158,
"acc_norm": 0.3366013071895425,
"acc_norm_stderr": 0.019117213911495158
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.4,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5224489795918368,
"acc_stderr": 0.031976941187136725,
"acc_norm": 0.5224489795918368,
"acc_norm_stderr": 0.031976941187136725
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5621890547263682,
"acc_stderr": 0.0350808011219984,
"acc_norm": 0.5621890547263682,
"acc_norm_stderr": 0.0350808011219984
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3674698795180723,
"acc_stderr": 0.03753267402120574,
"acc_norm": 0.3674698795180723,
"acc_norm_stderr": 0.03753267402120574
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5497076023391813,
"acc_stderr": 0.038158273659132366,
"acc_norm": 0.5497076023391813,
"acc_norm_stderr": 0.038158273659132366
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3047735618115055,
"mc1_stderr": 0.016114124156882455,
"mc2": 0.44825405044907884,
"mc2_stderr": 0.014892271476699756
},
"harness|winogrande|5": {
"acc": 0.6606156274664562,
"acc_stderr": 0.01330771492894175
},
"harness|gsm8k|5": {
"acc": 0.06444275966641395,
"acc_stderr": 0.006763391728488269
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
joey234/mmlu-professional_psychology-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
- name: neg_prompt
dtype: string
- name: fewshot_context_neg
dtype: string
- name: fewshot_context_ori
dtype: string
splits:
- name: dev
num_bytes: 10024
num_examples: 5
- name: test
num_bytes: 7470201
num_examples: 612
download_size: 555481
dataset_size: 7480225
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
# Dataset Card for "mmlu-professional_psychology-neg-prepend"
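The `answer` feature above is stored as a class-label index. As a minimal, hypothetical sketch (pure Python; no such helper ships with the dataset), the stored index can be mapped back to its answer letter:

```python
# Class-label names, mirroring the schema above ('0': A ... '3': D).
ANSWER_NAMES = ["A", "B", "C", "D"]

def answer_letter(index: int) -> str:
    """Map a stored class-label index (0-3) back to its answer letter."""
    return ANSWER_NAMES[index]

print(answer_letter(2))  # prints "C"
```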
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sbeyn/novel17 | ---
license: mit
---
|
Hack90/ncbi_genbank_part_17 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: sequence
dtype: string
- name: name
dtype: string
- name: description
dtype: string
- name: features
dtype: int64
- name: seq_length
dtype: int64
splits:
- name: train
num_bytes: 9388521444
num_examples: 13063434
download_size: 3796686652
dataset_size: 9388521444
---
# Dataset Card for "ncbi_genbank_part_17"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-college_chemistry-dev | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: dev
num_bytes: 2893
num_examples: 5
download_size: 0
dataset_size: 2893
---
# Dataset Card for "mmlu-college_chemistry-dev"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HuggingFaceM4/TabMWP |
StephanAkkerman/financial-tweets-other | ---
license: mit
---
|
suwesh/IoT-based-SmartParkingSystem-dataset | ---
license: ecl-2.0
---
About two years of IoT-based smart parking lot usage data collected on ThingSpeak, containing slot availability information with timestamps. |
MatthewWaller/cifar_stable_diffusion | ---
license: mit
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 136240130
num_examples: 60000
download_size: 137069319
dataset_size: 136240130
task_categories:
- text-to-image
size_categories:
- n<1K
--- |
mrbizapps/resume | ---
license: mit
---
|
drzraf/petfinder-dogs | ---
annotations_creators: []
language_creators:
- crowdsourced
license:
- unknown
multilinguality:
- monolingual
pretty_name: 300px dogs photos from Petfinder
size_categories:
- 100K<n<1M
source_datasets:
- original
tags:
- pets
- dogs
- animals
- photos
task_categories:
- image-classification
task_ids:
- multi-class-image-classification
---
# Dataset Card for "petfinder-dogs"
## Dataset Description
- **Homepage:** https://www.petfinder.com/
- **Paper:** N.A.
- **Leaderboard:** N.A.
- **Point of Contact:** N.A.
### Dataset Summary
Contains 700k+ 300px-wide images of 150k+ distinct dogs extracted from the PetFinder API in March 2023.
Only dogs with at least 4 photos are included: each subject has between 4 and 12 photos.
This dataset aims to simplify AI work on dog images and to avoid repeatedly rescraping thousands of them from the PetFinder API.
|
lchakkei/OpenOrca-Traditional-Chinese-ChatML-Format | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 6932005281
num_examples: 4233915
download_size: 3999084953
dataset_size: 6932005281
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
npvinHnivqn/VietnameseDictionary | ---
language:
- vi
size_categories:
- 20K<n<40K
---
- This dataset includes ~30k Vietnamese words and definitions |
vigneshgs7/Boundary_detection_Doc_6 | ---
dataset_info:
features:
- name: name
dtype: string
- name: uuid
dtype: string
- name: status
dtype: string
- name: image
dtype: image
- name: label.annotations
list:
- name: id
dtype: int32
- name: category_id
dtype: int32
- name: label.segmentation_bitmap
dtype: image
splits:
- name: train
num_bytes: 13135529127.0
num_examples: 264
download_size: 867939073
dataset_size: 13135529127.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
pyakymenko/new_test_repo | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 39116602.0
num_examples: 502
download_size: 38127697
dataset_size: 39116602.0
---
# Dataset Card for "new_test_repo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MaulikMadhavi/autotrain-data-sample | ---
task_categories:
- image-classification
---
# AutoTrain Dataset for project: sample
## Dataset Description
This dataset has been automatically processed by AutoTrain for project sample.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"image": "<500x375 RGB PIL image>",
"target": 1
},
{
"image": "<378x274 RGB PIL image>",
"target": 0
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"image": "Image(decode=True, id=None)",
"target": "ClassLabel(names=['cat', 'dog'], id=None)"
}
```
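As a small illustration (pure Python; hypothetical code, not part of the AutoTrain output), the integer `target` values in the sample above can be decoded using the `ClassLabel` names from the field schema:

```python
# ClassLabel names from the schema: ['cat', 'dog'] -> index 0 is cat, 1 is dog.
CLASS_NAMES = ["cat", "dog"]

# The two samples shown above, keeping only the integer targets.
samples = [{"target": 1}, {"target": 0}]

decoded = [CLASS_NAMES[s["target"]] for s in samples]
print(decoded)  # prints ['dog', 'cat']
```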
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 80 |
| valid | 20 |
|
AdapterOcean/Open_Platypus_standardized_cluster_9_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 3879349
num_examples: 10191
download_size: 1767734
dataset_size: 3879349
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Open_Platypus_standardized_cluster_9_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_alchemonaut__QuartetAnemoi-70B-t0.0001 | ---
pretty_name: Evaluation run of alchemonaut/QuartetAnemoi-70B-t0.0001
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [alchemonaut/QuartetAnemoi-70B-t0.0001](https://huggingface.co/alchemonaut/QuartetAnemoi-70B-t0.0001)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_alchemonaut__QuartetAnemoi-70B-t0.0001\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-04T09:33:24.428024](https://huggingface.co/datasets/open-llm-leaderboard/details_alchemonaut__QuartetAnemoi-70B-t0.0001/blob/main/results_2024-02-04T09-33-24.428024.json)\
\ (note that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7529733519925265,\n\
\ \"acc_stderr\": 0.028560689453846086,\n \"acc_norm\": 0.7560953011110824,\n\
\ \"acc_norm_stderr\": 0.029110491550081063,\n \"mc1\": 0.5373317013463892,\n\
\ \"mc1_stderr\": 0.017454645150970588,\n \"mc2\": 0.6953224067002621,\n\
\ \"mc2_stderr\": 0.014718923922056508\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6919795221843004,\n \"acc_stderr\": 0.013491429517292038,\n\
\ \"acc_norm\": 0.7337883959044369,\n \"acc_norm_stderr\": 0.012915774781523214\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7132045409281019,\n\
\ \"acc_stderr\": 0.004513409114983832,\n \"acc_norm\": 0.8889663413662617,\n\
\ \"acc_norm_stderr\": 0.0031353173122281234\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6962962962962963,\n\
\ \"acc_stderr\": 0.03972552884785137,\n \"acc_norm\": 0.6962962962962963,\n\
\ \"acc_norm_stderr\": 0.03972552884785137\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8355263157894737,\n \"acc_stderr\": 0.03016753346863271,\n\
\ \"acc_norm\": 0.8355263157894737,\n \"acc_norm_stderr\": 0.03016753346863271\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.8,\n\
\ \"acc_stderr\": 0.04020151261036844,\n \"acc_norm\": 0.8,\n \
\ \"acc_norm_stderr\": 0.04020151261036844\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.024618298195866514,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.024618298195866514\n \
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.875,\n\
\ \"acc_stderr\": 0.02765610492929436,\n \"acc_norm\": 0.875,\n \
\ \"acc_norm_stderr\": 0.02765610492929436\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.03295304696818318,\n\
\ \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.03295304696818318\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.46078431372549017,\n\
\ \"acc_stderr\": 0.04959859966384181,\n \"acc_norm\": 0.46078431372549017,\n\
\ \"acc_norm_stderr\": 0.04959859966384181\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7489361702127659,\n\
\ \"acc_stderr\": 0.02834696377716245,\n \"acc_norm\": 0.7489361702127659,\n\
\ \"acc_norm_stderr\": 0.02834696377716245\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.5877192982456141,\n \"acc_stderr\": 0.04630653203366596,\n\
\ \"acc_norm\": 0.5877192982456141,\n \"acc_norm_stderr\": 0.04630653203366596\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.7379310344827587,\n \"acc_stderr\": 0.036646663372252565,\n \"\
acc_norm\": 0.7379310344827587,\n \"acc_norm_stderr\": 0.036646663372252565\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5370370370370371,\n \"acc_stderr\": 0.025680564640056882,\n \"\
acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.025680564640056882\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5634920634920635,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.5634920634920635,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8741935483870967,\n\
\ \"acc_stderr\": 0.01886583428803001,\n \"acc_norm\": 0.8741935483870967,\n\
\ \"acc_norm_stderr\": 0.01886583428803001\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.6551724137931034,\n \"acc_stderr\": 0.033442837442804574,\n\
\ \"acc_norm\": 0.6551724137931034,\n \"acc_norm_stderr\": 0.033442837442804574\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\"\
: 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781675,\n\
\ \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781675\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.898989898989899,\n \"acc_stderr\": 0.021469735576055332,\n \"\
acc_norm\": 0.898989898989899,\n \"acc_norm_stderr\": 0.021469735576055332\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.016731085293607558,\n\
\ \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.016731085293607558\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7794871794871795,\n \"acc_stderr\": 0.0210206726808279,\n \
\ \"acc_norm\": 0.7794871794871795,\n \"acc_norm_stderr\": 0.0210206726808279\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.4148148148148148,\n \"acc_stderr\": 0.030039842454069283,\n \
\ \"acc_norm\": 0.4148148148148148,\n \"acc_norm_stderr\": 0.030039842454069283\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8697478991596639,\n \"acc_stderr\": 0.021863258494852128,\n\
\ \"acc_norm\": 0.8697478991596639,\n \"acc_norm_stderr\": 0.021863258494852128\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4900662251655629,\n \"acc_stderr\": 0.04081677107248437,\n \"\
acc_norm\": 0.4900662251655629,\n \"acc_norm_stderr\": 0.04081677107248437\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9211009174311927,\n \"acc_stderr\": 0.011558198113769553,\n \"\
acc_norm\": 0.9211009174311927,\n \"acc_norm_stderr\": 0.011558198113769553\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.7129629629629629,\n \"acc_stderr\": 0.030851992993257013,\n \"\
acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.030851992993257013\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9264705882352942,\n \"acc_stderr\": 0.018318855850089678,\n \"\
acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.018318855850089678\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9156118143459916,\n \"acc_stderr\": 0.018094247116473335,\n \
\ \"acc_norm\": 0.9156118143459916,\n \"acc_norm_stderr\": 0.018094247116473335\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8116591928251121,\n\
\ \"acc_stderr\": 0.02624113299640726,\n \"acc_norm\": 0.8116591928251121,\n\
\ \"acc_norm_stderr\": 0.02624113299640726\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8625954198473282,\n \"acc_stderr\": 0.030194823996804475,\n\
\ \"acc_norm\": 0.8625954198473282,\n \"acc_norm_stderr\": 0.030194823996804475\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.9256198347107438,\n \"acc_stderr\": 0.02395268883667674,\n \"\
acc_norm\": 0.9256198347107438,\n \"acc_norm_stderr\": 0.02395268883667674\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8518518518518519,\n\
\ \"acc_stderr\": 0.03434300243630999,\n \"acc_norm\": 0.8518518518518519,\n\
\ \"acc_norm_stderr\": 0.03434300243630999\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.030446777687971723,\n\
\ \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.030446777687971723\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6607142857142857,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.6607142857142857,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8737864077669902,\n \"acc_stderr\": 0.03288180278808629,\n\
\ \"acc_norm\": 0.8737864077669902,\n \"acc_norm_stderr\": 0.03288180278808629\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9145299145299145,\n\
\ \"acc_stderr\": 0.018315891685625845,\n \"acc_norm\": 0.9145299145299145,\n\
\ \"acc_norm_stderr\": 0.018315891685625845\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8978288633461047,\n\
\ \"acc_stderr\": 0.01083072471313418,\n \"acc_norm\": 0.8978288633461047,\n\
\ \"acc_norm_stderr\": 0.01083072471313418\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8352601156069365,\n \"acc_stderr\": 0.019971040982442272,\n\
\ \"acc_norm\": 0.8352601156069365,\n \"acc_norm_stderr\": 0.019971040982442272\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.646927374301676,\n\
\ \"acc_stderr\": 0.015984204545268575,\n \"acc_norm\": 0.646927374301676,\n\
\ \"acc_norm_stderr\": 0.015984204545268575\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.826797385620915,\n \"acc_stderr\": 0.021668400256514307,\n\
\ \"acc_norm\": 0.826797385620915,\n \"acc_norm_stderr\": 0.021668400256514307\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8263665594855305,\n\
\ \"acc_stderr\": 0.021514051585970393,\n \"acc_norm\": 0.8263665594855305,\n\
\ \"acc_norm_stderr\": 0.021514051585970393\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8611111111111112,\n \"acc_stderr\": 0.019242526226544533,\n\
\ \"acc_norm\": 0.8611111111111112,\n \"acc_norm_stderr\": 0.019242526226544533\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6099290780141844,\n \"acc_stderr\": 0.02909767559946393,\n \
\ \"acc_norm\": 0.6099290780141844,\n \"acc_norm_stderr\": 0.02909767559946393\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5912646675358539,\n\
\ \"acc_stderr\": 0.012555701346703387,\n \"acc_norm\": 0.5912646675358539,\n\
\ \"acc_norm_stderr\": 0.012555701346703387\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8125,\n \"acc_stderr\": 0.023709788253811766,\n \
\ \"acc_norm\": 0.8125,\n \"acc_norm_stderr\": 0.023709788253811766\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8333333333333334,\n \"acc_stderr\": 0.015076937921915376,\n \
\ \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.015076937921915376\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8285714285714286,\n \"acc_stderr\": 0.024127463462650156,\n\
\ \"acc_norm\": 0.8285714285714286,\n \"acc_norm_stderr\": 0.024127463462650156\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9253731343283582,\n\
\ \"acc_stderr\": 0.018581939698490618,\n \"acc_norm\": 0.9253731343283582,\n\
\ \"acc_norm_stderr\": 0.018581939698490618\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.95,\n \"acc_stderr\": 0.021904291355759057,\n \
\ \"acc_norm\": 0.95,\n \"acc_norm_stderr\": 0.021904291355759057\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n\
\ \"acc_stderr\": 0.038367221765980515,\n \"acc_norm\": 0.5843373493975904,\n\
\ \"acc_norm_stderr\": 0.038367221765980515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015577,\n\
\ \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015577\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5373317013463892,\n\
\ \"mc1_stderr\": 0.017454645150970588,\n \"mc2\": 0.6953224067002621,\n\
\ \"mc2_stderr\": 0.014718923922056508\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8531965272296764,\n \"acc_stderr\": 0.009946627440250693\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.686125852918878,\n \
\ \"acc_stderr\": 0.012782681251053191\n }\n}\n```"
repo_url: https://huggingface.co/alchemonaut/QuartetAnemoi-70B-t0.0001
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|arc:challenge|25_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|gsm8k|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hellaswag|10_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T09-33-24.428024.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-04T09-33-24.428024.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- '**/details_harness|winogrande|5_2024-02-04T09-33-24.428024.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-04T09-33-24.428024.parquet'
- config_name: results
data_files:
- split: 2024_02_04T09_33_24.428024
path:
- results_2024-02-04T09-33-24.428024.parquet
- split: latest
path:
- results_2024-02-04T09-33-24.428024.parquet
---
# Dataset Card for Evaluation run of alchemonaut/QuartetAnemoi-70B-t0.0001
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [alchemonaut/QuartetAnemoi-70B-t0.0001](https://huggingface.co/alchemonaut/QuartetAnemoi-70B-t0.0001) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_alchemonaut__QuartetAnemoi-70B-t0.0001",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-04T09:33:24.428024](https://huggingface.co/datasets/open-llm-leaderboard/details_alchemonaut__QuartetAnemoi-70B-t0.0001/blob/main/results_2024-02-04T09-33-24.428024.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7529733519925265,
"acc_stderr": 0.028560689453846086,
"acc_norm": 0.7560953011110824,
"acc_norm_stderr": 0.029110491550081063,
"mc1": 0.5373317013463892,
"mc1_stderr": 0.017454645150970588,
"mc2": 0.6953224067002621,
"mc2_stderr": 0.014718923922056508
},
"harness|arc:challenge|25": {
"acc": 0.6919795221843004,
"acc_stderr": 0.013491429517292038,
"acc_norm": 0.7337883959044369,
"acc_norm_stderr": 0.012915774781523214
},
"harness|hellaswag|10": {
"acc": 0.7132045409281019,
"acc_stderr": 0.004513409114983832,
"acc_norm": 0.8889663413662617,
"acc_norm_stderr": 0.0031353173122281234
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6962962962962963,
"acc_stderr": 0.03972552884785137,
"acc_norm": 0.6962962962962963,
"acc_norm_stderr": 0.03972552884785137
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8355263157894737,
"acc_stderr": 0.03016753346863271,
"acc_norm": 0.8355263157894737,
"acc_norm_stderr": 0.03016753346863271
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036844,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036844
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8,
"acc_stderr": 0.024618298195866514,
"acc_norm": 0.8,
"acc_norm_stderr": 0.024618298195866514
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.875,
"acc_stderr": 0.02765610492929436,
"acc_norm": 0.875,
"acc_norm_stderr": 0.02765610492929436
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.03295304696818318,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.03295304696818318
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.04959859966384181,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.04959859966384181
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7489361702127659,
"acc_stderr": 0.02834696377716245,
"acc_norm": 0.7489361702127659,
"acc_norm_stderr": 0.02834696377716245
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5877192982456141,
"acc_stderr": 0.04630653203366596,
"acc_norm": 0.5877192982456141,
"acc_norm_stderr": 0.04630653203366596
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7379310344827587,
"acc_stderr": 0.036646663372252565,
"acc_norm": 0.7379310344827587,
"acc_norm_stderr": 0.036646663372252565
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.025680564640056882,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.025680564640056882
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5634920634920635,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.5634920634920635,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8741935483870967,
"acc_stderr": 0.01886583428803001,
"acc_norm": 0.8741935483870967,
"acc_norm_stderr": 0.01886583428803001
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6551724137931034,
"acc_stderr": 0.033442837442804574,
"acc_norm": 0.6551724137931034,
"acc_norm_stderr": 0.033442837442804574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.027998073798781675,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.027998073798781675
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.898989898989899,
"acc_stderr": 0.021469735576055332,
"acc_norm": 0.898989898989899,
"acc_norm_stderr": 0.021469735576055332
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.016731085293607558,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.016731085293607558
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7794871794871795,
"acc_stderr": 0.0210206726808279,
"acc_norm": 0.7794871794871795,
"acc_norm_stderr": 0.0210206726808279
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4148148148148148,
"acc_stderr": 0.030039842454069283,
"acc_norm": 0.4148148148148148,
"acc_norm_stderr": 0.030039842454069283
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8697478991596639,
"acc_stderr": 0.021863258494852128,
"acc_norm": 0.8697478991596639,
"acc_norm_stderr": 0.021863258494852128
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4900662251655629,
"acc_stderr": 0.04081677107248437,
"acc_norm": 0.4900662251655629,
"acc_norm_stderr": 0.04081677107248437
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9211009174311927,
"acc_stderr": 0.011558198113769553,
"acc_norm": 0.9211009174311927,
"acc_norm_stderr": 0.011558198113769553
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.030851992993257013,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.030851992993257013
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.018318855850089678,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.018318855850089678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9156118143459916,
"acc_stderr": 0.018094247116473335,
"acc_norm": 0.9156118143459916,
"acc_norm_stderr": 0.018094247116473335
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8116591928251121,
"acc_stderr": 0.02624113299640726,
"acc_norm": 0.8116591928251121,
"acc_norm_stderr": 0.02624113299640726
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8625954198473282,
"acc_stderr": 0.030194823996804475,
"acc_norm": 0.8625954198473282,
"acc_norm_stderr": 0.030194823996804475
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9256198347107438,
"acc_stderr": 0.02395268883667674,
"acc_norm": 0.9256198347107438,
"acc_norm_stderr": 0.02395268883667674
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.03434300243630999,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.03434300243630999
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.030446777687971723,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.030446777687971723
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6607142857142857,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.6607142857142857,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.8737864077669902,
"acc_stderr": 0.03288180278808629,
"acc_norm": 0.8737864077669902,
"acc_norm_stderr": 0.03288180278808629
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9145299145299145,
"acc_stderr": 0.018315891685625845,
"acc_norm": 0.9145299145299145,
"acc_norm_stderr": 0.018315891685625845
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8978288633461047,
"acc_stderr": 0.01083072471313418,
"acc_norm": 0.8978288633461047,
"acc_norm_stderr": 0.01083072471313418
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8352601156069365,
"acc_stderr": 0.019971040982442272,
"acc_norm": 0.8352601156069365,
"acc_norm_stderr": 0.019971040982442272
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.646927374301676,
"acc_stderr": 0.015984204545268575,
"acc_norm": 0.646927374301676,
"acc_norm_stderr": 0.015984204545268575
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.826797385620915,
"acc_stderr": 0.021668400256514307,
"acc_norm": 0.826797385620915,
"acc_norm_stderr": 0.021668400256514307
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8263665594855305,
"acc_stderr": 0.021514051585970393,
"acc_norm": 0.8263665594855305,
"acc_norm_stderr": 0.021514051585970393
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.019242526226544533,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.019242526226544533
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6099290780141844,
"acc_stderr": 0.02909767559946393,
"acc_norm": 0.6099290780141844,
"acc_norm_stderr": 0.02909767559946393
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5912646675358539,
"acc_stderr": 0.012555701346703387,
"acc_norm": 0.5912646675358539,
"acc_norm_stderr": 0.012555701346703387
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8125,
"acc_stderr": 0.023709788253811766,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.023709788253811766
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.015076937921915376,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.015076937921915376
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8285714285714286,
"acc_stderr": 0.024127463462650156,
"acc_norm": 0.8285714285714286,
"acc_norm_stderr": 0.024127463462650156
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9253731343283582,
"acc_stderr": 0.018581939698490618,
"acc_norm": 0.9253731343283582,
"acc_norm_stderr": 0.018581939698490618
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.95,
"acc_stderr": 0.021904291355759057,
"acc_norm": 0.95,
"acc_norm_stderr": 0.021904291355759057
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.038367221765980515,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.038367221765980515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015577,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015577
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5373317013463892,
"mc1_stderr": 0.017454645150970588,
"mc2": 0.6953224067002621,
"mc2_stderr": 0.014718923922056508
},
"harness|winogrande|5": {
"acc": 0.8531965272296764,
"acc_stderr": 0.009946627440250693
},
"harness|gsm8k|5": {
"acc": 0.686125852918878,
"acc_stderr": 0.012782681251053191
}
}
```
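As a sketch of how the per-task scores above roll up into an aggregate, the snippet below averages `acc` over a few of the `hendrycksTest` entries copied from the results JSON (only three tasks are included here for illustration, so the mean differs from the full leaderboard aggregate):

```python
# Average the "acc" metric over a subset of the per-task results above.
# Values are copied from the results JSON; only three tasks are shown,
# so this illustrates the aggregation rather than the full MMLU score.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.4},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6962962962962963},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.8355263157894737},
}

mmlu_tasks = [k for k in results if k.startswith("harness|hendrycksTest-")]
mean_acc = sum(results[k]["acc"] for k in mmlu_tasks) / len(mmlu_tasks)
print(round(mean_acc, 4))  # prints 0.6439
```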
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
---
license: cc-by-2.5
task_categories:
- image-classification
- image-segmentation
- image-feature-extraction
language:
- en
tags:
- biology
- X-Ray
size_categories:
- 1K<n<10K
---
# Dataset Card for FracAtlas
<!-- Provide a quick summary of the dataset. -->
The "FracAtlas" dataset is a collection of musculoskeletal radiographs for bone fracture classification, localization, and segmentation.
It includes 4,083 X-Ray images (717 fracture images) with corresponding annotations in multiple formats. The annotations include segmentations, bounding boxes, and related metadata in COCO, VGG, YOLO, and Pascal VOC formats.
The dataset is intended for use in deep learning tasks in medical imaging, specifically targeting the understanding of bone fractures.
It is freely available under a CC-BY 4.0 license.
This script provides a Hugging Face `datasets` loader for the FracAtlas dataset. The generated dataset includes high-quality X-Ray images and incorporates detailed annotations from the COCO JSON format for segmentation
and bounding box information, as well as additional localization data from PASCAL VOC XML files. The loader handles downloading and preparing the dataset, making it readily available for machine learning models and analysis
tasks in medical imaging, especially focusing on the detection and understanding of bone fractures.
- **Curated by:** Abedeen, Iftekharul; Rahman, Md. Ashiqur; Zohra Prottyasha, Fatema; Ahmed, Tasnim; Mohmud Chowdhury, Tareque; Shatabda, Swakkhar
- **License:** cc-by-2.5
### Dataset Sources
<!-- Provide the basic links for the dataset. -->
The source data for the "FracAtlas" dataset is hosted on Figshare, an online digital repository where researchers can preserve and share their research outputs, including datasets. The FracAtlas dataset is freely accessible under a CC-BY 4.0 license, allowing for widespread use in the scientific community, particularly among researchers and practitioners in medical imaging and related fields.
The data was created, cleaned, and managed by Iftekharul Abedeen, Md. Ashiqur Rahman, Fatema Zohra Prottyasha, Tasnim Ahmed, Tareque Mohmud Chowdhury & Swakkhar Shatabda. More details related to data collection and annotation can be found in the Source Data section.
- **Repository:** https://figshare.com/articles/dataset/The_dataset/22363012
- **Paper:** https://www.nature.com/articles/s41597-023-02432-4
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
The "FracAtlas" dataset can be used to develop multiple machine learning or deep learning algorithms. For example:
1. Developing a deep learning model to automatically detect fractures in radiographs.
2. Classifying the type of fractures (e.g., hairline, compound, transverse) using machine learning models
3. Implementing segmentation models to delineate bone structures from the surrounding tissues in the radiographs
4. Forecasting patients’ outcomes based on the characteristics of the fracture and other patient data
5. Developing models to identify anomalous patterns in the radiographs of bones
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
### Original Dataset Schema
The original zip file contains 3 subfolders, “images”, “Annotations”, and “utilities”, plus a “dataset.csv” file.
The "images" folder contains 2 subfolders, "Fractured" and "Non-fractured"; each image is stored in the corresponding folder in JPG format.
The "Annotations" folder contains 4 subfolders: "COCO JSON", "PASCAL VOC", "VGG JSON", and "YOLO". Annotations are stored in their corresponding folders. More details can be found in the Annotations section.
The "utilities" folder contains several programming scripts that can be used to convert the raw files to a more readable format. None of them were used in this dataset loader.
The "dataset.csv" file contains basic variables for each image: <br />
- **image_id [string]:** The unique identifier for each radiograph image in the dataset. <br />
- **hand [int]:** A binary indicator (1 or 0) marking the presence of a hand in the radiograph <br />
- **leg [int]:** A binary indicator (1 or 0) marking the presence of a leg in the radiograph <br />
- **hip [int]:** A binary indicator (1 or 0) marking the presence of a hip in the radiograph <br />
- **shoulder [int]:** A binary indicator (1 or 0) marking the presence of a shoulder in the radiograph <br />
- **mixed [int]:** A binary indicator of whether the image contains multiple body parts <br />
- **hardware [int]:** A binary indicator marking the presence of medical hardware (i.e. screws or plates) in the image <br />
- **multiscan [int]:** A binary indicator signifying whether the image is part of a set of multiple scans <br />
- **fractured [int]:** A binary indicator of whether there is a fracture present in the image <br />
- **fracture_count [int]:** The number of fractures present in the image <br />
- **frontal [int]:** A binary indicator denoting the front orientation of the radiograph <br />
- **lateral [int]:** A binary indicator denoting the side orientation of the radiograph <br />
- **oblique [int]:** A binary indicator denoting the angled orientation of the radiograph <br />
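To illustrate how these columns might be consumed, the sketch below parses a two-row excerpt shaped like `dataset.csv` (the rows and image ids are invented for illustration) and filters for fractured images using only the standard library:

```python
import csv
import io

# A made-up two-row excerpt with the same columns as dataset.csv.
sample_csv = """image_id,hand,leg,hip,shoulder,mixed,hardware,multiscan,fractured,fracture_count,frontal,lateral,oblique
IMG0000001.jpg,1,0,0,0,0,0,0,1,1,1,0,0
IMG0000002.jpg,0,1,0,0,0,0,0,0,0,1,0,0
"""

rows = list(csv.DictReader(io.StringIO(sample_csv)))
# The binary columns are read as strings; cast to int before filtering.
fractured = [r["image_id"] for r in rows if int(r["fractured"]) == 1]
print(fractured)  # ['IMG0000001.jpg']
```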
### Updated Dataset Schema
In this dataset loader, certain existing variables are extracted from the original "dataset.csv" and then mapped to specific Hugging Face feature classes for clarity, for instance ClassLabel.
Other important variables are extracted from other downloaded files in the "FracAtlas" zip file to present a more systematic and clean FracAtlas dataset.
The full schema of the HuggingFace dataset loader is below:
- **image_id [string]:** The unique identifier for each radiograph image in the dataset. <br />
- **Image [image]:** A PIL image object denoting each X-ray image. This can be used to load the image file directly. <br />
- **hand [ClassLabel]:** A binary indicator (1 or 0) marking the presence of a hand in the radiograph <br />
- **leg [ClassLabel]:** A binary indicator (1 or 0) marking the presence of a leg in the radiograph <br />
- **hip [ClassLabel]:** A binary indicator (1 or 0) marking the presence of a hip in the radiograph <br />
- **shoulder [ClassLabel]:** A binary indicator (1 or 0) marking the shoulder in the radiograph <br />
- **mixed [ClassLabel]:** A binary indicator of whether the image contains multiple body parts <br />
- **hardware [ClassLabel]:** A binary indicator marking the presence of medical hardware (i.e. screws or plates) in the image <br />
- **multiscan [ClassLabel]:** A binary indicator signifying whether the image is part of a set of multiple scans <br />
- **fractured [ClassLabel]:** A binary indicator of whether there is a fracture present in the image <br />
- **fracture_count [int]:** The number of fractures present in the image <br />
- **frontal [ClassLabel]:** A binary indicator (1 or 0) denoting the front orientation of the radiograph <br />
- **lateral [ClassLabel]:** A binary indicator (1 or 0) denoting the side orientation of the radiograph <br />
- **oblique [ClassLabel]:** A binary indicator (1 or 0) denoting the angled orientation of the radiograph <br />
- **localization_metadata [dict/Features]:** Metadata about the image localization, including the width (int), height (int), and depth (int) of the image
- **segmentation_metadata[dict/Features]:** Metadata about the segmentation, including the 1) segmentation(Sequence of Sequence of floats), 2) bounding box(Sequence of floats), and 3) area(float) covered by the segmentation. This can be None if no segmentation data is available
Also, we should note that even though the authors claim that annotations are provided only for images with fractures, some of the non-fracture images also have annotation data, and some of the fracture images do not. Therefore, to maximize the integrity of the data, both **fractured** and **segmentation_metadata** are kept for users. This is probably because annotations were done manually and are thus subject to errors, as the authors mention in the corresponding paper.
Furthermore, **hand**, **leg**, **hip**, and **shoulder** are not mutually exclusive, so they are stored as independent variables.
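Because the body-part flags are independent, a single record can flag several regions at once. A small helper like the one below (the record values are hypothetical) could collect the regions present in a sample and check whether segmentation data exists:

```python
# Collect the body-part regions flagged in a record shaped like the
# updated schema above. All record values here are hypothetical.
BODY_PARTS = ("hand", "leg", "hip", "shoulder")

def regions_present(record):
    """Return the body-part regions whose binary flag is set."""
    return [part for part in BODY_PARTS if record.get(part) == 1]

record = {
    "image_id": "IMG0000003.jpg",
    "hand": 1,
    "leg": 1,
    "hip": 0,
    "shoulder": 0,
    "fractured": 1,
    "segmentation_metadata": None,  # may be None even for fractured images
}

print(regions_present(record))  # ['hand', 'leg']
has_segmentation = record["segmentation_metadata"] is not None
print(has_segmentation)  # False
```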
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
The creation of the FracAtlas dataset was driven by the need for a comprehensive and specialized collection of medical images to train machine learning models for fracture detection. The dataset aims to address the gap in the availability of annotated musculoskeletal radiographs necessary for advancing AI-assisted diagnostic tools. The choices involved in its assembly, including the selection of specific types of radiographs and the detailed annotation process, were governed by the objective of providing a robust resource that can significantly improve the accuracy and efficiency of fracture diagnosis in the medical field.
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
In the initial phase, a total of 14,068 X-Rays were collected.
Due to privacy concerns, all the DICOM images were given an arbitrary image name and converted to JPG image format.
This automatically got rid of all the sensitive information that was present in the metadata of DICOM images.
These conversions were done using the proprietary software of the corresponding X-ray machines.
The renamed DICOM images were stored in the hospital database separately for later study of general distribution.
All the X-ray scans that have been collected are for general-purpose diagnosis.
This means along with bone fracture scans there are also samples for chest diseases and abnormalities in the skull and spine region.
In the collected data the number of bone fracture samples in the chest, skull and spine region was sparse.
As a result, scans for the said parts were removed with the supervision of a medical officer.
This left us with 4,083 scans from the hand, leg, hip and shoulder regions.
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
The FracAtlas dataset was built from over 14,000 X-ray scans collected from several medical facilities across Bangladesh, with a substantial portion sourced from Lab-Aid Medical Center.
Following collection, a meticulous data cleaning phase was undertaken to ensure the integrity and usability of the scans.
Finally, the dataset was enhanced with detailed annotations.
Ethical approval was secured, ensuring the confidentiality of patient data, and all participants provided informed consent.
The collection process was designed to be non-intrusive to the standard diagnostic and treatment protocols of the involved hospitals.
### Annotations
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
The dataset includes 4,083 images that have been manually annotated for bone fracture classification, localization, and segmentation with the help of 2 expert radiologists.
Annotations have later been verified and merged by an orthopedist using the open-source labeling platform, makesense.ai.
There are 4 types of annotations provided in the Annotations folder:
1. Common Objects in Context (COCO) JSON: a single JSON file containing the annotations for the fractured images (717 in total), including segmentation, bbox, and area for each one. This is mainly used for segmentation. Notice that the COCO annotation covers only images that have fractures.
2. PASCAL VOC: one XML file per image, used for localization. Each XML file includes the height, width, depth, and segmented data of its image.
3. VGG JSON: a single JSON file containing the annotations for the fractured images.
4. YOLO: one TXT file per image, used for localization.
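A COCO-style file like the one in item 1 can be grouped into per-image boxes with a few lines of Python. The payload below is a minimal made-up stand-in that follows the COCO field convention, not actual FracAtlas data:

```python
# Minimal COCO-style payload (field names follow the COCO convention;
# the IDs, file name, and coordinates are invented for illustration).
coco = {
    "images": [{"id": 1, "file_name": "IMG0000045.jpg", "width": 373, "height": 454}],
    "annotations": [{
        "id": 10, "image_id": 1,
        "bbox": [104.0, 171.0, 61.0, 61.0],  # [x, y, width, height]
        "area": 3721.0,
        "segmentation": [[104.0, 171.0, 165.0, 171.0, 165.0, 232.0, 104.0, 232.0]],
    }],
}

# Group bounding boxes by file name, as a detection data loader would.
id_to_name = {img["id"]: img["file_name"] for img in coco["images"]}
boxes_by_image = {}
for ann in coco["annotations"]:
    boxes_by_image.setdefault(id_to_name[ann["image_id"]], []).append(ann["bbox"])

print(boxes_by_image)
```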
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
All personally identifiable information in the gathered data has been removed, and the process was administered according to the Institutional Research Ethics Board of United International University.
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
While the FracAtlas dataset is particularly valuable for the development of computer-aided diagnosis systems, its potential limitations should be carefully considered. Firstly, the manual annotation process is susceptible to human error, which may result in mislabeling. Such inaccuracies can impact the performance of machine learning models trained on this data. For example, although the authors claim that annotations (segmentation, area, bounding box) are provided only for fracture images, some non-fractured images also have annotations; conversely, some fractured images lack corresponding annotations.
It should also be noted that using the dataset correctly requires knowledge of the medical and radiology fields in order to interpret results and draw conclusions from it, and that the possibility of labeling errors should always be kept in mind.
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**APA:**
Abedeen, I., Rahman, M. A., Prottyasha, F. Z., Ahmed, T., Chowdhury, T. M., & Shatabda, S. (2023). FracAtlas: A Dataset for Fracture Classification, Localization and Segmentation of Musculoskeletal Radiographs. Scientific data, 10(1), 521. https://doi.org/10.1038/s41597-023-02432-4 |
kenanazam/channel_metadata | ---
dataset_info:
features:
- name: Channel ID
dtype: string
- name: Title
dtype: string
- name: Time Created
dtype: string
- name: Time Published
dtype: string
- name: Duration
dtype: string
- name: Description
dtype: string
- name: Category
dtype: string
splits:
- name: train
num_bytes: 33467
num_examples: 10
download_size: 27899
dataset_size: 33467
---
# Dataset Card for "channel_metadata"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LexiconShiftInnovations/Continued_Pretrained_Dataset_Dental_Sinhala | ---
dataset_info:
features:
- name: Text
dtype: string
splits:
- name: train
num_bytes: 2293626
num_examples: 7286
download_size: 849600
dataset_size: 2293626
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bleroni/kosovo-government-contract-titles-with-classification | ---
license: mit
---
A dataset consisting of government contract titles in Kosovo, along with institution names, published date and category/classification. |
ChanceFocus/flare-cfa | ---
dataset_info:
features:
- name: query
dtype: string
- name: answer
dtype: string
- name: text
dtype: string
- name: choices
sequence: string
- name: gold
dtype: int64
splits:
- name: test
num_bytes: 727156
num_examples: 1032
download_size: 285343
dataset_size: 727156
---
# Dataset Card for "flare-cfa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rcds/swiss_court_view_generation | ---
task_categories:
- text-generation
language:
- de
- fr
- it
size_categories:
- 100K<n<1M
license: cc-by-sa-4.0
pretty_name: Swiss Court View Generation
---
# Dataset Card for Swiss Court View Generation
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
Swiss Court View Generation is a multilingual, diachronic dataset of 404K Swiss Federal Supreme Court (FSCS) cases. This dataset is part of a challenging text generation task.
This dataset contains court views for different languages and court chambers. It includes information such as decision id, language, chamber, file name, url, and the number of tokens in the facts and considerations sections.
Main (L1) contains all the data, Origin (L2) contains only data with complete origin facts & origin considerations.
### Supported Tasks and Leaderboards
### Languages
Switzerland has four official languages, three of which (German, French, and Italian) are represented in this dataset. The decisions are written by the judges and clerks in the language of the proceedings.
| Language | Subset | Number of Documents Main | Number of Documents Origin |
|------------|------------|--------------------------|--------------------------|
| German | **de** | 197K | 49 |
| French | **fr** | 163K | 221 |
| Italian | **it** | 44K | 0 |
## Dataset Structure
### Data Fields
```
decision_id (string)
facts (string)
considerations (string)
origin_facts (string)
origin_considerations (string)
law_area (string)
language (string)
year (int32)
court (string)
chamber (string)
canton (string)
region (string)
```
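A sketch of how these fields might be used to build (facts, considerations) pairs for the generation task. The rows below are toy stand-ins; for the real data one would load the dataset with `datasets` (e.g. `load_dataset("rcds/swiss_court_view_generation", ...)`; the exact config names are not shown here):

```python
# Toy records with the fields listed above (contents invented).
rows = [
    {"decision_id": "a1", "language": "de", "year": 2015,
     "facts": "Sachverhalt ...", "considerations": "Erwägungen ..."},
    {"decision_id": "b2", "language": "fr", "year": 2020,
     "facts": "Faits ...", "considerations": "Considérants ..."},
    {"decision_id": "c3", "language": "it", "year": 2020,
     "facts": "Fatti ...", "considerations": "Considerandi ..."},
]

# Build (input, target) pairs for one language, e.g. recent French decisions.
pairs = [(r["facts"], r["considerations"])
         for r in rows if r["language"] == "fr" and r["year"] >= 2018]
print(len(pairs))
```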
### Data Instances
[More Information Needed]
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
The original data are published from the Swiss Federal Supreme Court (https://www.bger.ch) in unprocessed formats (HTML). The documents were downloaded from the Entscheidsuche portal (https://entscheidsuche.ch) in HTML.
#### Who are the source language producers?
The decisions are written by the judges and clerks in the language of the proceedings.
### Annotations
#### Annotation process
#### Who are the annotators?
Metadata is published by the Swiss Federal Supreme Court (https://www.bger.ch).
### Personal and Sensitive Information
The dataset contains publicly available court decisions from the Swiss Federal Supreme Court. Personal or sensitive information has been anonymized by the court before publication according to the following guidelines: https://www.bger.ch/home/juridiction/anonymisierungsregeln.html.
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
We release the data under CC-BY-4.0 which complies with the court licensing (https://www.bger.ch/files/live/sites/bger/files/pdf/de/urteilsveroeffentlichung_d.pdf)
© Swiss Federal Supreme Court, 2002-2022
The copyright for the editorial content of this website and the consolidated texts, which is owned by the Swiss Federal Supreme Court, is licensed under the Creative Commons Attribution 4.0 International licence. This means that you can re-use the content provided you acknowledge the source and indicate any changes you have made.
Source: https://www.bger.ch/files/live/sites/bger/files/pdf/de/urteilsveroeffentlichung_d.pdf
### Citation Information
Please cite our [ArXiv-Preprint](https://arxiv.org/abs/2306.09237)
```
@misc{rasiah2023scale,
title={SCALE: Scaling up the Complexity for Advanced Language Model Evaluation},
author={Vishvaksenan Rasiah and Ronja Stern and Veton Matoshi and Matthias Stürmer and Ilias Chalkidis and Daniel E. Ho and Joel Niklaus},
year={2023},
eprint={2306.09237},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
### Contributions
|
yongchanskii/youtube-data-various-domain | ---
dataset_info:
features:
- name: source
dtype: string
- name: channelName
dtype: string
- name: category
dtype: string
- name: title
dtype: string
- name: videoId
dtype: string
- name: domainTag
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcriptionPath
dtype: string
- name: start
dtype: float64
- name: end
dtype: float64
- name: WER
dtype: float64
- name: CER
dtype: float64
- name: referenceText
dtype: string
- name: hypotheseText
dtype: string
- name: referenceTextLength
dtype: int64
- name: hypotheseTextLength
dtype: int64
splits:
- name: train
num_bytes: 559381442.864
num_examples: 2288
- name: test
num_bytes: 137840916.0
num_examples: 572
download_size: 691274587
dataset_size: 697222358.864
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "youtube-data-various-domain"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Alejandro-FA/ma_ai_text_data | ---
dataset_info:
- config_name: eda_embedding_sample
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: words
sequence: string
- name: word_count
dtype: int64
- name: avg_word_len
dtype: float64
- name: __index_level_0__
dtype: int64
- name: embedding
sequence: float64
- name: embedding_tsne
dtype: float32
- name: embedding_tsne_1
dtype: float32
splits:
- name: train
num_bytes: 118500238
num_examples: 10000
download_size: 85299744
dataset_size: 118500238
- config_name: training
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 1485162830
num_examples: 346977
- name: validation
num_bytes: 370060226
num_examples: 86587
download_size: 1712213816
dataset_size: 1855223056
configs:
- config_name: eda_embedding_sample
data_files:
- split: train
path: eda_embedding_sample/train-*
- config_name: training
data_files:
- split: train
path: training/train-*
- split: validation
path: training/validation-*
---
|
liuyanchen1015/MULTI_VALUE_stsb_invariant_tag_fronted_isnt | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 160
num_examples: 1
- name: test
num_bytes: 190
num_examples: 1
download_size: 5813
dataset_size: 350
---
# Dataset Card for "MULTI_VALUE_stsb_invariant_tag_fronted_isnt"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
recogna-nlp/FakeRecogna | ---
task_categories:
- text-classification
language:
- pt
tags:
- FakeRecogna
- Fake News
- Portuguese
- Dataset
license: mit
size_categories:
- 10K<n<100K
---
# FakeRecogna
FakeRecogna is a dataset comprising real and fake news. The real news is not directly linked to the fake news and vice versa, which could otherwise lead to biased classification. The news was collected by web crawlers developed to mine the pages of well-known, nationally important news agencies. The crawlers were tailored to each analyzed web page, with the extracted information first separated into categories and then grouped by date. The plurality of news sources and the different writing styles give the dataset great diversity for natural language processing analysis and machine learning algorithms.
## The Dataset
The news collection was performed by crawlers developed to mine the pages of well-known, nationally important news agencies. The fake-news mining focused mainly on pages listed by the [Duke Reporters Lab](https://reporterslab.org/fact-checking/), which provides a list of pages that verify the veracity of news worldwide. There were 160 active fact-checking agencies in the world in 2019, and Brazil figures as a growing ecosystem with currently 9 initiatives; 6 of the 9 pages were considered during the search, with great variation in the number of fake news items extracted from each one, yielding 5,951 samples. Table 1 presents the current initiatives as well as the number of fake news items collected from each source.
| Fact-Check Agency | Web address | # News |
| ------------------ | ------------------------------------ | ------ |
| Boatos.org | https://boatos.org | 2,605 |
| Fato ou Fake | https://oglobo.globo.com/fato-ou-fake| 1,055 |
| E-farsas | https://www.e-farsas.com | 812 |
| UOL Confere | https://noticias.uol.com.br/confere | 582 |
| AFP Checamos | https://checamos.afp.com/afp-brasil | 509 |
| Projeto Comprova | https://checamos.afp.com/afp-brasil | 388 |
| Total | -------------------------------------| 5,951 |
Concerning the real news, the crawlers searched portals such as [G1](https://g1.globo.com/), [UOL](https://www.uol.com.br/) and [Extra](https://extra.globo.com/), which are publicly recognized as reliable news outlets, as well as the [Ministry of Health of Brazil](https://www.gov.br/saude/pt-br) home page, resulting in a collection of over 100,000 samples. From this set, 5,951 samples were selected to keep the balance between classes, resulting in a dataset of 11,902 samples.
## More informations
The FakeRecogna dataset is available on GitHub as a single XLSX file with 8 metadata columns, where each row stands for a sample (real or fake news), as described in Table 2.
| Columns | Description |
| ------------------------ | ------------------------------------------ |
| Title | Title of article |
| Sub-title (if available) | Brief description of news |
| News | Information about the article |
| Category | News grouped according to your information |
| Author | Publication author |
| Date | Publication date |
| URL | Article web address |
| Class | 0 for fake news and 1 for real news |
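A minimal sketch of working with the single XLSX and the Class convention above. The toy frame stands in for the real file (the file name in the commented line is an assumption; use the path from the GitHub release):

```python
import pandas as pd

# For the real data: df = pd.read_excel("FakeRecogna.xlsx")  # hypothetical file name
df = pd.DataFrame({
    "Title": ["t1", "t2", "t3", "t4"],
    "News": ["...", "...", "...", "..."],
    "Class": [0, 1, 0, 1],  # 0 = fake news, 1 = real news (see Table 2)
})

# Check the class balance before splitting into train/test.
counts = df["Class"].value_counts().to_dict()
print(counts)  # the full dataset is balanced: 5,951 samples per class
```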
The collected texts are distributed into six categories according to their main subjects: Brazil, Entertainment, Health, Politics, Science, and World. These categories are defined based on the journal sections from which the news was extracted. The distribution of news by category and its percentages is described in Table 3.
| Category | # News | % |
| -------------- | ---------- | ------ |
| Brazil | 904 | 7.6 |
| Entertainment | 1,409 | 12.00 |
| Health | 4,456 | 37.4 |
| Politics       | 3,951      | 33.1   |
| Science | 602 | 5.1 |
| World | 580 | 4.9 |
| Total | 11,902 | 100.00 |
# Citation
```
@InProceedings{garcia2022fakerecogna,
author="Garcia, Gabriel L and Afonso, Luis CS and Papa, Jo{\~a}o P",
title="FakeRecogna: A new Brazilian corpus for fake news detection",
booktitle="International Conference on Computational Processing of the Portuguese Language",
year="2022",
publisher="Springer International Publishing",
address="Cham",
pages="57--67",
isbn="978-3-030-98305-5"}
``` |
ImageIN/unlabelled_IA_with_snorkel_labels | ---
annotations_creators:
- machine-generated
language: []
language_creators: []
license:
- cc0-1.0
multilinguality: []
pretty_name: 'Historic book pages illustration weak annotations'
size_categories: []
source_datasets: []
tags:
- lam
- historic
- glam
- books
task_categories:
- image-classification
task_ids:
- multi-class-image-classification
---
# Historic book pages illustration weak annotations |
dim/panorama_prompts_10k | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 30478073
num_examples: 11024
download_size: 15784032
dataset_size: 30478073
---
# Dataset Card for "panorama_prompts_10k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vietgpt/open_subtitles_envi | ---
dataset_info:
features:
- name: en
dtype: string
- name: vi
dtype: string
splits:
- name: train
num_bytes: 280063489
num_examples: 3505276
download_size: 176803145
dataset_size: 280063489
task_categories:
- translation
language:
- en
- vi
tags:
- LM
size_categories:
- 1M<n<10M
---
# OpenSubtitles
- Source: https://huggingface.co/datasets/open_subtitles
- Num examples: 3,505,276
- Languages: English, Vietnamese
```python
from datasets import load_dataset
load_dataset("vietgpt/open_subtitles_envi")
```
- Format for Translation task
```python
import random

def preprocess(
sample,
instruction_key="### Instruction:",
input_key="Input:",
response_key="<|endofprompt|>",
end_key="<|endoftext|>",
en2vi=True,
):
if en2vi:
if random.random() < 0.5:
instruction = "Translate the following sentences from English into Vietnamese."
else:
instruction = "Dịch các câu sau từ tiếng Anh sang tiếng Việt."
input = sample['en'].strip()
response = sample['vi'].strip()
else:
if random.random() < 0.5:
instruction = "Translate the following sentences from Vietnamese into English."
else:
instruction = "Dịch các câu sau từ tiếng Việt sang tiếng Anh."
input = sample['vi'].strip()
response = sample['en'].strip()
return {'text': """Below is an instruction that describes a task, paired with an input that provides further context. Write a response that appropriately completes the request.
{instruction_key}
{instruction}
{input_key}
{input}
{response_key}
{response}
{end_key}""".format(
instruction_key=instruction_key,
instruction=instruction,
input_key=input_key,
input=input,
response_key=response_key,
response=response,
end_key=end_key,
)}
"""
Below is an instruction that describes a task, paired with an input that provides further context. Write a response that appropriately completes the request.
### Instruction:
Dịch các câu sau từ tiếng Anh sang tiếng Việt.
Input:
Line up, I say!
<|endofprompt|>
Sắp hàng, nghe chưa!
<|endoftext|>
"""
``` |
Eduardovco/Rafa | ---
license: openrail
---
|
jxm/synthbio | ---
dataset_info:
features:
- name: serialized_attrs
dtype: string
- name: biographies
sequence: string
- name: notable_type
dtype: string
- name: attrs
struct:
- name: Bronze
dtype: string
- name: Gold
dtype: string
- name: Gold, 1984
dtype: string
- name: Gold, 1988
dtype: string
- name: Gold, 1992
dtype: string
- name: Gold, 1994
dtype: string
- name: Gold, 1996
dtype: string
- name: Gold, 1998
dtype: string
- name: Gold, 2002
dtype: string
- name: Gold, 2004
dtype: string
- name: Self-portrait of Toma Klima (2001)
dtype: string
- name: Silver, 2006
dtype: string
- name: Silver, 2007
dtype: string
- name: agency
dtype: string
- name: alias
dtype: string
- name: allegiance
dtype: string
- name: alma_mater
dtype: string
- name: associated_acts
dtype: string
- name: awards
dtype: string
- name: birth_date
dtype: string
- name: birth_name
dtype: string
- name: birth_place
dtype: string
- name: children
dtype: string
- name: citizenship
dtype: string
- name: coach
dtype: string
- name: codename
dtype: string
- name: collegeteam
dtype: string
- name: country
dtype: string
- name: criminal_penalty
dtype: string
- name: death_cause
dtype: string
- name: death_date
dtype: string
- name: death_place
dtype: string
- name: doctoral_advisor
dtype: string
- name: education
dtype: string
- name: elected
dtype: string
- name: event
dtype: string
- name: father
dtype: string
- name: fields
dtype: string
- name: final_ascent
dtype: string
- name: gender
dtype: string
- name: genre
dtype: string
- name: height
dtype: string
- name: hometown
dtype: string
- name: influenced
dtype: string
- name: influences
dtype: string
- name: institutions
dtype: string
- name: instrument
dtype: string
- name: known_for
dtype: string
- name: label
dtype: string
- name: language
dtype: string
- name: main_interests
dtype: string
- name: mother
dtype: string
- name: movement
dtype: string
- name: name
dtype: string
- name: national_team
dtype: string
- name: nationality
dtype: string
- name: notable_ascents
dtype: string
- name: notable_students
dtype: string
- name: notable_works
dtype: string
- name: occupation
dtype: string
- name: olympics
dtype: string
- name: operation
dtype: string
- name: paralympics
dtype: string
- name: partner
dtype: string
- name: partnerships
dtype: string
- name: position
dtype: string
- name: resting_place
dtype: string
- name: retired
dtype: string
- name: serviceyears
dtype: string
- name: sport
dtype: string
- name: start_age
dtype: string
- name: thesis_title
dtype: string
- name: thesis_year
dtype: string
- name: tradition_movement
dtype: string
- name: weight
dtype: string
- name: worlds
dtype: string
- name: years_active
dtype: string
splits:
- name: train
num_bytes: 5581070
num_examples: 2237
download_size: 2360383
dataset_size: 5581070
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "synthbio"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Thundergb/Dipperbr | ---
license: openrail
---
|
eduagarcia/oab_exams | ---
dataset_info:
features:
- name: id
dtype: string
- name: question_number
dtype: int32
- name: exam_id
dtype: string
- name: exam_year
dtype: string
- name: question_type
dtype: string
- name: nullified
dtype: bool
- name: question
dtype: string
- name: choices
sequence:
- name: text
dtype: string
- name: label
dtype: string
- name: answerKey
dtype: string
splits:
- name: train
num_bytes: 2423759
num_examples: 2210
download_size: 1256596
dataset_size: 2423759
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
samwit/modified_WIESP2022-NER | ---
dataset_info:
features:
- name: text
dtype: string
- name: ner_outputs
sequence: string
- name: raw_tags
sequence: string
- name: ner_outputs_json
dtype: string
splits:
- name: train
num_bytes: 9332101
num_examples: 1753
download_size: 3045204
dataset_size: 9332101
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
argilla/websight-5K-multimodal | ---
size_categories: 1K<n<10K
tags:
- rlfh
- argilla
- human-feedback
---
# Dataset Card for websight-5K-multimodal
This dataset has been created with [Argilla](https://docs.argilla.io).
It is a subset of 5000 records from the [Websight](https://huggingface.co/datasets/HuggingFaceM4/WebSight?row=0) collection, which is used for HTML/CSS code generation from an input image.
Below you can see a screenshot of the UI from where annotators can work comfortably.

As shown in the sections below, this dataset can be loaded into Argilla as explained in [Load with Argilla](#load-with-argilla), or used directly with the `datasets` library in [Load with `datasets`](#load-with-datasets).
## Dataset Description
- **Homepage:** https://argilla.io
- **Repository:** https://github.com/argilla-io/argilla
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains:
* A dataset configuration file conforming to the Argilla dataset format named `argilla.yaml`. This configuration file will be used to configure the dataset when using the `FeedbackDataset.from_huggingface` method in Argilla.
* Dataset records in a format compatible with HuggingFace `datasets`. These records will be loaded automatically when using `FeedbackDataset.from_huggingface` and can be loaded independently using the `datasets` library via `load_dataset`.
* The [annotation guidelines](#annotation-guidelines) that have been used for building and curating the dataset, if they've been defined in Argilla.
### Load with Argilla
To load with Argilla, you'll just need to install Argilla as `pip install argilla --upgrade` and then use the following code:
```python
import argilla as rg
ds = rg.FeedbackDataset.from_huggingface("argilla/websight-5K-multimodal")
```
### Load with `datasets`
To load this dataset with `datasets`, you'll just need to install `datasets` as `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
ds = load_dataset("argilla/websight-5K-multimodal")
```
### Supported Tasks and Leaderboards
This dataset can contain [multiple fields, questions and responses](https://docs.argilla.io/en/latest/conceptual_guides/data_model.html#feedback-dataset) so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the [Dataset Structure section](#dataset-structure).
There are no leaderboards associated with this dataset.
### Languages
[More Information Needed]
## Dataset Structure
### Data in Argilla
The dataset is created in Argilla with: **fields**, **questions**, **suggestions**, **metadata**, **vectors**, and **guidelines**.
The **fields** are the dataset records themselves, for the moment just text fields are supported. These are the ones that will be used to provide responses to the questions.
| Field Name | Title | Type | Required | Markdown |
| ---------- | ----- | ---- | -------- | -------- |
| image | Image | text | True | True |
| html_code | Html_code | text | True | True |
The **questions** are the questions that will be asked to the annotators. They can be of different types, such as rating, text, label_selection, multi_label_selection, or ranking.
| Question Name | Title | Type | Required | Description | Values/Labels |
| ------------- | ----- | ---- | -------- | ----------- | ------------- |
| accuracy | Assess if the generated code accurately reflects the layout, design elements, and style shown in the image. | rating | True | N/A | [1, 2, 3, 4, 5, 6, 7] |
| quality | Assess the generated code for cleanliness, efficiency, and proper use of HTML/CSS practices. | multi_label_selection | True | N/A | ['clean code', 'efficient', 'proper tags and classes'] |
| correction | Identify any errors or issues in the generated HTML/CSS code and suggest possible corrections. | text | True | N/A | N/A |
The **suggestions** are human- or machine-generated recommendations for each question to assist the annotator during the annotation process. They are always linked to the existing questions and are named by appending "-suggestion" and "-suggestion-metadata" to the question name, containing the value(s) of the suggestion and its metadata, respectively. Thus, the possible values are the same as in the table above, but the column name is appended with "-suggestion" and the metadata with "-suggestion-metadata".
The **metadata** is a dictionary that can be used to provide additional information about the dataset record. This can be useful to provide additional context to the annotators, or to provide additional information about the dataset record itself. For example, you can use this to provide a link to the original source of the dataset record, or to provide additional information about the dataset record itself, such as the author, the date, or the source. The metadata is always optional, and can be potentially linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
| Metadata Name | Title | Type | Values | Visible for Annotators |
| ------------- | ----- | ---- | ------ | ---------------------- |
The **guidelines**, are optional as well, and are just a plain string that can be used to provide instructions to the annotators. Find those in the [annotation guidelines](#annotation-guidelines) section.
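The suggestion naming convention can be illustrated with a toy flattened record (the keys and metadata fields below are illustrative assumptions, not verified against this dataset's export):

```python
# A hypothetical flattened record as it might appear after export:
record = {
    "accuracy": None,  # annotator response, empty until labeled
    "accuracy-suggestion": 5,
    "accuracy-suggestion-metadata": {"type": "model", "score": None, "agent": None},
}

# Collect the pre-filled suggestion values (metadata keys end with
# "-suggestion-metadata", so they are excluded by this filter).
suggestions = {k: v for k, v in record.items() if k.endswith("-suggestion")}
print(suggestions)
```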
### Data Instances
An example of a dataset instance in Argilla looks as follows:
```json
{
"external_id": null,
"fields": {
"html_code": "```json\n\u003chtml\u003e\n\u003cstyle\u003e\nbody {\n font-family: Arial, sans-serif;\n margin: 0;\n padding: 0;\n border-box;\n}\n\nheader {\n background: #ff0;\n text-align: center;\n padding: 20px;\n}\n\n#video-container {\n width: 70%;\n margin: auto;\n text-align: center;\n}\n\n#video-player {\n width: 100%;\n height: 300px;\n background: #f0f;\n}\n\n#chef-tips {\n margin: 40px;\n}\n\n#chef-tips p {\n color: #000;\n line-height: 1.6;\n}\n\nfooter {\n background: #ff0;\n text-align: center;\n padding: 20px;\n position: fixed;\n width: 100%;\n bottom: 0;\n}\n\u003c/style\u003e\n\u003cbody\u003e\n \u003cheader\u003e\n \u003ch1\u003eInter\u0027national\u003c/h1\u003e\n \u003c/header\u003e\n \u003cmain\u003e\n \u003csection id=\"video-container\"\u003e\n \u003cvideo id=\"video-player\" src=\"#\" controls\u003e\u003c/video\u003e\n \u003c/section\u003e\n \u003csection id=\"chef-tips\"\u003e\n \u003ch2\u003eChef\u0027s Tips\u003c/h2\u003e\n \u003cp\u003eEnjoy the tasty and healthy recipes shared by the best internationally recognized chefs. Discover the latest cooking trends and techniques.\u003c/p\u003e\n \u003c/section\u003e\n \u003c/main\u003e\n \u003cfooter\u003e\n \u003cp\u003e\u00a9 2022 Inter\u0027national. All rights reserved.\u003c/p\u003e\n \u003c/footer\u003e\n \u003c/body\u003e\n\u003c/html\u003e\n```",
"image": "\u003cimg src=\"data:image/png;base64,
MkwACAAAAAAZJgAEAAAAgAwTAAIAAABAhgkAAQAAACDDBIAAAAAAkGECQAAAAADIMAEgAAAAAGSYABAAAAAAMkwACAAAAAAZJgAEAAAAgAwTAAIAAABAhgkAAQAAACDDBIAAAAAAkGECQAAAAADIMAEgAAAAAGSYABAAAAAAMkwACAAAAAAZJgAEAAAAgAwTAAIAAABAhgkAAQAAACDDBIAAAAAAkGECQAAAAADIMAEgAAAAAGSYABAAAAAAMkwACAAAAAAZJgAEAAAAgAwTAAIAAABAhgkAAQAAACDDBIAAAAAAkGECQAAAAADIMAEgAAAAAGSYABAAAAAAMkwACAAAAAAZJgAEAAAAgAwTAAIAAABAhgkAAQAAACDDBIAAAAAAkGECQAAAAADIMAEgAAAAAGSYABAAAAAAMkwACAAAAAAZJgAEAAAAgAwTAAIAAABAhgkAAQAAACDDBIAAAAAAkGECQAAAAADIMAEgAAAAAGSYABAAAAAAMkwACAAAAAAZJgAEAAAAgAwTAAIAAABAhgkAAQAAACDDBIAAAAAAkGECQAAAAADIMAEgAAAAAGSYABAAAAAAMkwACAAAAAAZJgAEAAAAgAwTAAIAAABAhgkAAQAAACDDBIAAAAAAkGECQAAAAADIMAEgAAAAAGSYABAAAAAAMkwACAAAAAAZJgAEAAAAgAwTAAIAAABAhgkAAQAAACDDBIAAAAAAkGECQAAAAADIMAEgAAAAAGSYABAAAAAAMkwACAAAAAAZJgAEAAAAgAwTAAIAAABAhgkAAQAAACDDBIAAAAAAkGECQAAAAADIMAEgAAAAAGSYABAAAAAAMkwACAAAAAAZJgAEAAAAgAwrjzi/q8sAAAAAAHSSXJJE0tWFAAAAAAA6h1uAAQAAACDDBIAAAAAAkGECQAAAAADIMAEgAAAAAGSYABAAAAAAMkwACAAAAAAZJgAEAAAAgAwTAAIAAABAhgkAAQAAACDDBIAAAAAAkGECQAAAAADIMAEgAAAAAGSYABAAAAAAMkwACAAAAAAZJgAEAAAAgAwTAAIAAABAhgkAAQAAACDDBIAAAAAAkGECQAAAAADIMAEgAAAAAGSYABAAAAAAMkwACAAAAAAZJgAEAAAAgAwTAAIAAABAhgkAAQAAACDDBIAAAAAAkGECQAAAAADIMAEgAAAAAGSYABAAAAAAMkwACAAAAAAZJgAEAAAAgAwTAAIAAABAhgkAAQAAACDDBIAAAAAAkGECQAAAAADIMAEgAAAAAGSYABAAaKGuLuL99yM++CCioWHV3t+e+vqOmU8p0wAAQHcmAAQACmbMiDjiiIiePSOGDo3YbLOIAQMiTjstorp6xe9taIj45S8jhg9P39+/f8Q3vpGGiE09+mjEwQdHlJdHjBwZcemlxeOrqiJOPTViyJB0PptvHnH++RFLlqzcNM3ddVdELpd2V8bzz0fcccfKvae5//wnXfYxx6zefJqrrY246KLG9X788XQ5Dz3Usctpz8KF6XIvvHDl33vTTel7d9659fHjxkV87nPp3w88kE77xBOlz3/ZsvQ955xT+nsuuCB9z+LFbU/TfNvTtqeeSrfnvfd2dUkAoPsSAAIAEZGGf7vtFjFrVsTtt0fMnBnx7rsRP/95Gizts8+KQ8D//d+IE0+M2GSTiKuvjvja19JwZ5990hAmv4w99khDwZtuivjMZyLOPLM4BPzGNyKuvDJi990jfv3riO22izjvvIjTT1+5aZpLkuJuKZYti9hxx4iXXir9PSta9qrUplyRK66IOPvsxtqUFRUR48en4euatDrrd/31aXfKlIhXXmk5vr6+5XxX5jMsK0u3yaablv6eUvaV5tue9q3M5wYAdCwBIAAQM2em4d+OO0b8+c8RH/94xO9/H3H33RE77RTxyCNpGLbPPmnNp9b89KdpAPXYYxHHHRdx7bVpLb2XX4547rl0mjPPTLv33BNx6KERN9wQsffeET/+ccTSpWng
eM89EQcdFPGHP6Q15u69N2L06LR24fLlpU2ztskHcoMGdex8mwdjY8ZEPPlkGnitC955J91fLrgg7c+HgR2pvDzdJscd17Hz7egwFwCgMwkAAYC4446IwYMjbr454pZbIv7rvyIefjjiH/+I2GqrNKh58MGIN99s/fbLJUvSQO9nP0sDl7zRo9PunDlpYHLPPREHHBAxcGDjNF//elqz8J//TJ/nd+qpESedVDz/7bdPuwsXljZNKZ57Lr299JFHIr70pfQWxZEj0xAxSdLbP/O3pV57bcSuu6Z/52913m679D077BBx662N862tTed77bXpNJtvHvGXv0RUVkaMGpVOc9116TLvuy8NWHO59D2PPto4n+XL01pm48al43O5iH33jXj99XT81VdHXHVV+veuu6Zh6AsvpNM/9VTjfO64o3EeI0dGfP/7jbetLluWjrvttjRIHTIkfZ14YvHtr3PmpNt7m23S+fTvnwZq8+eXtq3bcsstaff//b+IL34x4je/WfFtt+15+ul0fW68MV2PHXZo/JyvvrpxuieeSG9D798/nebOO9Np/vzn4vndc086PpeL+OxnI/72t3R4a9s+Ih02cmTjtj777DTYbsu3vpXeuv7Vr6ZlOeOMdPjbb0d8+cvpsCFD0hqvs2Y1vq+2Nv2Mhgxp3Hd+97viebc3j2uuifjv/464/PJ0mr33Tudz3nnF8/noo+LtV1MTcfLJ6a3+/ftH7LVXut819cQT6bLzt3ZPmdL2NgAA1pAkicTLy8vLy8ure7/23z+SSy+NpK4ukohILr64cdwVV0Tym9+kfx9+eCRnn136fPfcM53fO+9EMmtW+veZZxZP87e/pcPvuqv1edTUpOOHDm17OaVMc+ed6TR33pn2//WvaX9EJKNGRXL66ZEMG5b2339/JEuXRnLOOWn/hAmRXHhh+r4zzmh8z4UXNq7jr39dXJaISCor0zLddVckTzyRboMkieS88xqn2XPPSE49tbF/zpx0mvPPT/v32SeSiy5KP6OISEaMiKShIZLJkyPZZZd02Pe+F8mTT0by6KNp/wMPpPO44Ya0f8yYSC67LJJDD22cZ5JEUlvbuNyKikhOOqlxnk0/p/ywo46K5Mc/jmT8+LT/hBPS8VVVaf+PflT6vlFfn26bMWPS/ltvTedxyy3F040ZE8nuu6d/339/Os3jj7c+zwcfbFyfESPSdfrPfxq3UZJE8uabjdOcdlokX/taY/9vf9vy8znkkEiOPrqxf/bs1rf9vfc2fp6XXhrJAQe0vr83feW3Y0Qko0en+8GMGWm5I9Ll/vCHjfv2okXF++BRR6XH6tixaf+996bjS5nHD35QvOwxYyL54hfT9y1Z0ljGa69Np3nyyfT8kF9Wfr/MHzMvvdS4fSsq0uGXXBLJgQc2LufPf+76c52Xl5eXl1d3fUVXF8DLy8vLy8ura1/Ll6c/zp9+OpLXXkv/fv311qe98cY0AChlvr/5TWMAkSSRvPVWy3AxSdLgICINGprPo6EhksMOS8fffHPryyllmiRpOwA84IDGafLrf/rpaf/SpWn/D36Q9r/3XtqfD6SSpDEUqahIA7V8AFhZmQZj+TI2LUs+YPrFLxqHXXddOuyee9LpR4xIg5m6usZpDjoonWbevLQ/HxLW1KT9TQPAurq0TCNGpOXKz+OYY9JpJk9uDAArKyOprk7HL1mSBkWjR6f977yTTnPqqcX7TEVFY3i3KgHgE08Ub4Pq6rR/l12Kp1uVAHDixMbtnv8M8wHgwQen/W+80XK7Ng8Am+6rl12WDnvssda3fT6Uywe4DQ3psk45pe1tkA8An3mm8T3HHZcOe/TRxuny4eKVVzZuk/znkySRzJyZbrcbbkj7S5lHPgC84orGZd9+e3GQmCTpfIcNS8ffdls6/vzzG8d/9FE6bP/90/5DDkn7Z8xonCa/zQWAXl5eXl5eXfdyCzAAdHO5XPrsvrq69BWRNpyQt2RJ4/DlyyN6
9Wp/nr/7XXp74+jRjQ189OzZuLym8svKNxSSV1cXcfzxaWMhhx8eccghLZdTyjTtOeCAxr9HjEi7bTV28vzzaXfUqPQW6YcfTm/b/eQn0/e8807jtJ//fMQGG6R/N1/nvH33bfx7223Tbk1NOv20aentq0uWpM9RvO229FmNEW0/h7Gp115Ly/Ttb0est17j8P32S7vPPNM47Etfilh//fTv3r3TstTUpP1bbJE2dHHppRFz56a3F19/fUSfPo3TrIqbbkq7X/hCettvWVl6W+4TTzTe5ryq8uvY2nb/xz/Sz/mTn2w5fXP//d+Nf++2W9qdPr31afOf32c+E3HJJWmDJn/4Q3pb/IpUVESMHdtY3r/+Nf172bLGfSwvf2v39tun+8Ree6W3TdfXp7cnH354Or6UeeTl98FcLr0NOyJi0qTGdX3iibRl8Fwu3XYRaYMq+fm++GK6PR95JB333HMREyemLYjnHX30ircBAND5ytufBADIsh490hDm8cfTZ3tFRPzpT43PIzv66PQ5ZrffnoZde+654vn97GcRp52WNkjxwAMRAwakwzfcMO0uWFA8fb6/oqJx2LJl6TMFJ02KOPLItKXf5mFOKdOUYqONGv8uK0vL0VbLrv/+d9q9+uriZ8rlzZiRNqASEfGJT6zcsvv0Sbv5Zb/8csQJJxQ/c7HpNmpP/nlvTYOYiIhddmksa96QIcXTrL9+cSB7111pC8tNA86Ixs92ZS1alD4jMaI4iMu74YY0RFtVH/tY68MXL07X+9hji4d/5jOtT9+05eB+/dJuW41/HHJIGrpeeGHEWWelr2HD0nXJPz+yNVttVdz/xhtpd++9W0779ttp97LL0ucy3n13xOTJ6bA99kiD9803L20eeUOHNv7dt2+6ba65Jt2///jHdPjXvpZ233037bYV6FVVpcveffe2lwEAdA0BIAAQn/1s+qP/5JPT7rHHpi3rLlkS8eyzabDx4Ydp4HDXXW3P59JL05Z+99gjbVghXwMuIg0XKipa1qDKB1Gbb552ly5NG0W4++40hLzoouIaiaVOU6qVeV++8ZJrronYf//Wx+drS5ZSU7KtZS9dmgattbXpNv3MZ9LalBddlL5KkQ8Xmweu+dqDw4c3DuvRo+35vPRSxIEHprW8rrsubRTjU5+KmDAhrRG4Ku65J+0ee2zaUEpT554b8atfpY1jNK25uDLK2/iGmw9ZmzcU01ZNxpXZN8rKIn7yk7SRlccfT9fxV79Kaw7W1ra9Ls33k8rKdNizz7acNl+LduDA9Dj86KO0cZ477kiPhaOPbmxwpr155DXfVoccku7fkyentTQnTIjYcst03ODBafell9JlNLfBBunwqqri4atTUxQA6BhuAQYACjV8vvzliG9+M+LJJ9PbC48+Og3+NtwwvaV1990ba5A1d/PNafh3wAER999fHP7l7bVXGlzkW6GNSIPCiMZWfP/nf9Iw45JL0ldrIUwp03SEfI3C5cvTbr621iOPRGy8cePrssvS2lZz5nTMcqdOjZg9O23t+PTT0xCmb9+Iv/89HZ8PGfPly/c3tcUWaTe/ffPuvz/t5m9ZbU++BuJVV6W3dW+/fRqgPftsy9u2S5VvsfbCC9MQsOnryCPTW5fzIWFHyuXSz+mee4pbMM7f8rqy84po3Pbf+U7aSnKPHhH77JPWoDv11HTce++VPt8ddkhD8TlzGvevRYvSIPGaa9Ll7bprelvuRhultWDvuiu9LT1/i25781iRCRPSGnuXXZa27pu/rTiisRXrf/yjcb6DBqW3bh9zTDpul13Smr9NQ7/87cEAQNdRAxAAiIED0+d57b57GvQdfnhaw662Nn2O2dVXp4HS7be3rEEUkQZCxx+f/j1sWMTFFxeP/+//Tmt6nXxyelvhIYekNfceeyyd55VXpreTPvVUxG9/m9YUrK1Na4E1ddJJ6bPx2psmX1NvdeXX9Z57IkaOTLfLHnukZa6oSIPTZ5+N+OlP
0/Bqs83SoGV1bb11Ov/bbktrZ663XnqLcz6My4cr+ef2XX55xFe+UjyPior0NtSLL06DqK9+NX223sknp7UJd9qptLJMmJB2r7kmDXvmzIn4wQ/SYc1rF+bNnJkGwRMnRpx9dvG46dPTWmsHHND653TooWmZr7km4qCDSivjyvjBD9KQavfd0yD59dfbf05fa5pv+z33bNwPDj00rZ03aVJaIy7/bMlSfO97aYB24IER3/1uGrL96EfprbVf+1paY2/nndOaoCNHpn+/9FIaGuefgdnePFakrCwN/s89N+0/8MDGcd/6VjqfM85IA+rPfCbixhvT/fLWW9NQNH+MH3xwWhvyzTfTfww0taL9AwDoJF3dComXl5eXl5fX2vOaOTOSCy6IZMKEtNXOiEj23jttKXTRorbfN3ly4/StvW66qXHaa65pHF5REclJJ6WtyiZJJD/84Yrn8+67pU3TWhn/9Kd0/J/+lPY/8khja7hNp6uoSFvKzfefeWbjvN9/P5LZsxtbNY3/vwXdY45pbJl30aKWLaU2f+VbkG26TV9+OR12yy1p/623xv/X3h2zxBGEYQCeVFYJwSJVwFQWFhaCYCMR/4GFlaWdYK/Cdbb+ABsLGwsFsUxjkTKVTUSQQFSIoCiIRVKom+JzOcU7NUZRPp8HBvZml7292eWKl535qp6e5vcMDZVqYSG25+fjmO3t+P5SSjUxERVqS4lquFUVVX6vXn+5rJB7cNDcX0qpGo3r1zc6GpWA68/T083vKZeVnaemYvv791KdnMT27Gwcv7MTn8fHb/72ubnYt7LSfnz6++OYHz9ie3g4+usqwF+/3v4crq83++oqwDMz14+rn/G+vrjuq2Pf6v5sbkbf4mLrsW81Tr29MT7tfufAwM2qx1VVquXl6+cZHLw+XsfHzWq7dRsZKdXR0f3P0WhE/9UK0XXb2op9o6M3921sxJjV5+3uvln9eWkpKgfXx0xONitc3/V8aJqmaZr2NO1NVZXqgdkhAJDY6Wm8bVSvm/aYzs6ioEZXV/v12l6Ses24d++afX/+lPLrVxT7eKopyKXEG3Pv37cvAHJ+XsrxcbxN124sz85K2d2NNxQ7Oh52HRcXMZX148f7rW/4Uq2txVgODzf7Vldj+vuXLzFN/b5ajf3FRRTa+PCh9TT4f7G/H/ers7P1/t+/475++tT+vt51joc6OYn/iNsKfOzuRoGZhz5zAMDjEQACAPBqjI+XsrAQod/nzzGlfGwspuzu7f1/aAcA8BIJAAEAeDV+/oz16a5WyH37Ntaj7Ot7rqsCAHhaAkAAAF6dw8NSvn2LojXd3VG9FwAgKwEgAAAAACT2hEtWAwAAAADPTQAIAAAAAIkJAAEAAAAgMQEgAAAAACQmAAQAAACAxASAAAAAAJCYABAAAAAAEhMAAgAAAEBiAkAAAAAASEwACAAAAACJCQABAAAAIDEBIAAAAAAkJgAEAAAAgMQEgAAAAACQmAAQAAAAABITAAIAAABAYgJAAAAAAEhMAAgAAAAAiQkAAQAAACAxASAAAAAAJCYABAAAAIDEBIAAAAAAkJgAEAAAAAASEwACAAAAQGICQAAAAABITAAIAAAAAIkJAAEAAAAgMQEgAAAAACQmAAQAAACAxASAAAAAAJCYABAAAAAAEhMAAgAAAEBiAkAAAAAASEwACAAAAACJCQABAAAAILG/WTMoiz93V3cAAAAASUVORK5CYII=\"\u003e"
},
"metadata": {},
"responses": [],
"suggestions": [
{
"agent": null,
"question_name": "correction",
"score": null,
"type": null,
"value": "```json\n\u003chtml\u003e\n\u003cstyle\u003e\nbody {\n font-family: Arial, sans-serif;\n margin: 0;\n padding: 0;\n border-box;\n}\n\nheader {\n background: #ff0;\n text-align: center;\n padding: 20px;\n}\n\n#video-container {\n width: 70%;\n margin: auto;\n text-align: center;\n}\n\n#video-player {\n width: 100%;\n height: 300px;\n background: #f0f;\n}\n\n#chef-tips {\n margin: 40px;\n}\n\n#chef-tips p {\n color: #000;\n line-height: 1.6;\n}\n\nfooter {\n background: #ff0;\n text-align: center;\n padding: 20px;\n position: fixed;\n width: 100%;\n bottom: 0;\n}\n\u003c/style\u003e\n\u003cbody\u003e\n \u003cheader\u003e\n \u003ch1\u003eInter\u0027national\u003c/h1\u003e\n \u003c/header\u003e\n \u003cmain\u003e\n \u003csection id=\"video-container\"\u003e\n \u003cvideo id=\"video-player\" src=\"#\" controls\u003e\u003c/video\u003e\n \u003c/section\u003e\n \u003csection id=\"chef-tips\"\u003e\n \u003ch2\u003eChef\u0027s Tips\u003c/h2\u003e\n \u003cp\u003eEnjoy the tasty and healthy recipes shared by the best internationally recognized chefs. Discover the latest cooking trends and techniques.\u003c/p\u003e\n \u003c/section\u003e\n \u003c/main\u003e\n \u003cfooter\u003e\n \u003cp\u003e\u00a9 2022 Inter\u0027national. All rights reserved.\u003c/p\u003e\n \u003c/footer\u003e\n \u003c/body\u003e\n\u003c/html\u003e\n```"
}
],
"vectors": {}
}
```
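As a sketch of how a record in this Argilla export format might be consumed, the snippet below parses a trimmed, hypothetical record with the standard-library `json` module and looks up a suggestion by its `question_name` (real records carry the full fields shown above, including the base64-encoded `image`):

```python
import json

# Hypothetical trimmed record mirroring the Argilla export structure above.
record_json = """
{
  "metadata": {},
  "responses": [],
  "suggestions": [
    {
      "agent": null,
      "question_name": "correction",
      "score": null,
      "type": null,
      "value": "<corrected html>"
    }
  ],
  "vectors": {}
}
"""

record = json.loads(record_json)

# Index suggestions by the question they answer.
suggestions = {s["question_name"]: s["value"] for s in record["suggestions"]}
print(suggestions["correction"])  # -> <corrected html>
```

This is only an illustration of the record layout; the names and values are placeholders, not part of the dataset.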
The same record in HuggingFace `datasets` looks as follows:
```json
{
"accuracy": [],
"accuracy-suggestion": null,
"accuracy-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"correction": [],
"correction-suggestion": "```json\n\u003chtml\u003e\n\u003cstyle\u003e\nbody {\n font-family: Arial, sans-serif;\n margin: 0;\n padding: 0;\n border-box;\n}\n\nheader {\n background: #ff0;\n text-align: center;\n padding: 20px;\n}\n\n#video-container {\n width: 70%;\n margin: auto;\n text-align: center;\n}\n\n#video-player {\n width: 100%;\n height: 300px;\n background: #f0f;\n}\n\n#chef-tips {\n margin: 40px;\n}\n\n#chef-tips p {\n color: #000;\n line-height: 1.6;\n}\n\nfooter {\n background: #ff0;\n text-align: center;\n padding: 20px;\n position: fixed;\n width: 100%;\n bottom: 0;\n}\n\u003c/style\u003e\n\u003cbody\u003e\n \u003cheader\u003e\n \u003ch1\u003eInter\u0027national\u003c/h1\u003e\n \u003c/header\u003e\n \u003cmain\u003e\n \u003csection id=\"video-container\"\u003e\n \u003cvideo id=\"video-player\" src=\"#\" controls\u003e\u003c/video\u003e\n \u003c/section\u003e\n \u003csection id=\"chef-tips\"\u003e\n \u003ch2\u003eChef\u0027s Tips\u003c/h2\u003e\n \u003cp\u003eEnjoy the tasty and healthy recipes shared by the best internationally recognized chefs. Discover the latest cooking trends and techniques.\u003c/p\u003e\n \u003c/section\u003e\n \u003c/main\u003e\n \u003cfooter\u003e\n \u003cp\u003e\u00a9 2022 Inter\u0027national. All rights reserved.\u003c/p\u003e\n \u003c/footer\u003e\n \u003c/body\u003e\n\u003c/html\u003e\n```",
"correction-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"external_id": null,
"html_code": "```json\n\u003chtml\u003e\n\u003cstyle\u003e\nbody {\n font-family: Arial, sans-serif;\n margin: 0;\n padding: 0;\n border-box;\n}\n\nheader {\n background: #ff0;\n text-align: center;\n padding: 20px;\n}\n\n#video-container {\n width: 70%;\n margin: auto;\n text-align: center;\n}\n\n#video-player {\n width: 100%;\n height: 300px;\n background: #f0f;\n}\n\n#chef-tips {\n margin: 40px;\n}\n\n#chef-tips p {\n color: #000;\n line-height: 1.6;\n}\n\nfooter {\n background: #ff0;\n text-align: center;\n padding: 20px;\n position: fixed;\n width: 100%;\n bottom: 0;\n}\n\u003c/style\u003e\n\u003cbody\u003e\n \u003cheader\u003e\n \u003ch1\u003eInter\u0027national\u003c/h1\u003e\n \u003c/header\u003e\n \u003cmain\u003e\n \u003csection id=\"video-container\"\u003e\n \u003cvideo id=\"video-player\" src=\"#\" controls\u003e\u003c/video\u003e\n \u003c/section\u003e\n \u003csection id=\"chef-tips\"\u003e\n \u003ch2\u003eChef\u0027s Tips\u003c/h2\u003e\n \u003cp\u003eEnjoy the tasty and healthy recipes shared by the best internationally recognized chefs. Discover the latest cooking trends and techniques.\u003c/p\u003e\n \u003c/section\u003e\n \u003c/main\u003e\n \u003cfooter\u003e\n \u003cp\u003e\u00a9 2022 Inter\u0027national. All rights reserved.\u003c/p\u003e\n \u003c/footer\u003e\n \u003c/body\u003e\n\u003c/html\u003e\n```",
"image": "\u003cimg src=\"data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAABQAAAALQCAYAAADPfd1WAAB/xElEQVR4nOzdeZxWdd0//vc1DCDgIIsOLlgJBoqJicKN4ZalpoXeZlpf03JJU2/3cskyl8rd8q4009JSbzUx09xSzCUjd1NTFPcCFZBlYAaGZWbO74/zu66ZaxbmAmYYOPN8Ph7X48xZrnM+51xnua7XfM755JIkkgAAAAAAMqmsqwsAAAAAAHQeASAAAAAAZJgAEAAAAAAyTAAIAAAAABkmAAQAAACADBMAAgAAAECGCQABAAAAIMMEgAAAAACQYQJAAAAAAMgwASAAAAAAZJgAEAAAAAAyTAAIAAAAABkmAAQAAACADBMAAgAAAECGCQABAAAAIMMEgAAAAACQYQJAAAAAAMgwASAAAAAAZJgAEAAAAAAyTAAIAAAAABkmAAQAAACADBMAAgAAAECGCQABAAAAIMMEgAAAAACQYQJAAAAAAMgwASAAAAAAZJgAEAAAAAAyTAAIAAAAABkmAAQAAACADBMAAgAAAECGCQABAAAAIMMEgAAAAACQYQJAAAAAAMgwASAAAAAAZJgAEAAAAAAyTAAIAAAAABkmAAQAAACADBMAAgAAAECGCQABAAAAIMMEgAAAAACQYQJAAAAAAMgwASAAAAAAZJgAEAAAAAAyTAAIAAAAABkmAAQAAACADBMAAgAAAECGlXd1AQCA7uPVVyM++qh42NixEf36dU15utqiRRHPPls8rF+/dJtkxVtvRcyY0dify0VMmBBR3s2+hT73XERNTWN/RUXEDjt0XXkAgO4llySRdHUhAIDu4QtfiHjwweJhTz4ZMX78mln+lCkRf/xjxE9/umaW156nnorYaafiYZWVEbNmdU15VkVDQ8TNN0fU1kZ8+9stxx9zTMR11xUPmzEjYrPN1kz51habb14chA4bFvH2211XHgCge3ELMACQedOnR3z96xE779wygGTVPfVUGt5+85sR06Z1dWkAAGhLN7v5AgDobi69NOLMM7u6FNlSXR1x4okRv/99V5cEAIBSqAEIAGTaj3/c1SXInunThX8AAOsSNQABALrIkCEth33842u+HJ3p4osjzjijsT+Xi9h0064rDwBAdyQABADWKfPnRzz/fNqK6qc+1bktCH/4YdpycZ8+EaNGRQwcuOrzWr48bfG3tjZixIi0UYhPfCJdj+rqxulGj165+dbWRvzznxFLl0Zsu23Ehhuu3Hs//DBi5syIefPSbTloUMQmm6SNkXSEQYPS18qaPz/iP/+J+OCDiA02iNh669Xb/nn19RGvvZbWYtx66zRwzeVKf//s2en2mjkzndfAgelr2LCInj1Xv3wAAJ3BLcAAwFpl+PA0kMm/LrssIknS7jbbpGHSnnumjU+sv37EuHFpANbckCHp+5uGaxERU6c2znvnnVu+b9GiiFNPTd+/6abpsnbeOV3u5pun4xYtar3sCxcWlz2Xi7j77oirr47o1StiwoSIz38+4mMfS1tErq9v2QLyNtsU9x93XPH8vvSldPh996Xl6ts3ne8ee0RstFFaxttuW/E2njYt4vjj0/cOH56+f+LEdB6f/nS67jvs0HI+r7+elqF5GSMifvazxjKec07j8FNPbblN2mrlOEnSVpq/8IV0e3/60xH77puWL7/9v/WtiI8+anvd/u//ipfVv39EXV26zvvtl4Z1226bzneLLdJw8fvfTwPattTVRfzudxHbbZdum+22i9h773QeO+0UsdVWEYMHp2X78MO25wMA0FUEgADAWqWmprh/3ryIb3wjvY106tSW0z/7bMSYMRGTJhUPr61tf1nz5xf3//Of6byuvDKt6dXcjBnpuDFjIl56qeX4+vqWw37724j/+Z+Wwz/6KKK8PA3amtp66+L+5mHjwoURv/hFGgROmdJ6Gf/f/4s47bSW4yLS2pNbbRXxq1+1
Pj7vhRfS+Zx6auOwZctW/J7WytxaWFpX13LY/PkRBx4Y8ZWvtN1S84wZ6fYcPjzizjtbn2bp0uL+6uqIxx6LGDs24p57WgbC1dURF14Yseuure8z9fURRx4ZccQRES+/3Poy8/P57W8jRo6MeOuttqcDAOgKAkAAYK128cURN9/c/nQnnFB6QNWaWbMidtst4o032p/2jTfS2mkzZrQ/7T33tD78iCPS7imnRDz9dONr111XPL8nnog46aT2l/uzn7UMTBcujPjsZ9t/b1NXXpmGgZ1p2bK0Rt2f/lTa9NXVaVj461+XNv2ee7YM/pp76qnW97PLL4+46abSlpMv23e/W/r0AABrggAQAFhnfO1rEeeem9bAa2727OKaY5tv3vZz7Cor01fTBjfOOadlSFRREXH22WkNsdaezXfJJSu/Dnlf/nLaHTIkvY05/+rbt/R57LFHxPnnR3zxi62Pv/764v6bbmq5jpWV6e3Av/xlxJlnpuvc3OTJabdPn3T61qbJz6uyMr0ddmWcd15ak7O5iop03XbZpfX3HXts+pzAUo0enX6eRx7Z+vhf/rK4f8mSiJ/8pOV0e+8dcdFFET/9aRouNnf33asXRgMAdDSNgAAAa72KirThhs02S/vPPDN9dl7zWzKnTUufZReRNt4RkT4DrmnoNWpU47i8f/4z4rrriocNG5beYrvxxmn/aadFHHRQcY2+X/4y4jvfSRvzWJHKyojf/Ca93ffhhyPefXf1W8J94IH0WXl5xx0Xcc01xdM0rwH49NMt5/O3v6W3reZttVVj7cS8999Pu5/8ZFpTcurUls8BPPXUNBBbWXPmpGFac4cfHnHVVY2B6LRpaWjafJ2OPTbi/vvbX87ZZxeHeXvumd7i3NTLL0c0NESUlTX29+lTvP8ceWR6q2/eySenjaY0v2V83rzGfQcAoKupAQgArPVuvrkx/ItIQ5nTT2853cKFqzb/Rx5pOezCC4sDnN69I37+85bT3Xpr+/P/3/9Ng8lNN02fZ3juuatWzrzTTisO/yLSULS55s84vPHGtLGL119PG9t47LHi8C8iYvfdW86nqmo1CtuO1hosGTs2Ddma1oYcOTKtWdfcAw+0/rzG5vM7//ziYV/7WsTQoS2nXbKk8e9x49LAc+7c9Nbrm25KbwluqqwsYv/9W86nvVuOAQDWJDUAAYC1Xmu3WQ4b1nJY8wYgStXac/+GD09rpzW1/vppbcSm4c706e3Pf7/9Vq1cbWntlt+PfazlsNZuQy0vT8O0fPC3YEH6jL/nnot4/PG0deHmmoZiHe2xx1oOO//8xlp4TW25ZcRRRxXXwItIa4e2dbt3RLq9ylv51rv11i2f47h0acvbsAcNSltc3nnntIbgtGnp9nrmmYh77414552W8y6lERoAgDVFAAgArNUqK9Maf821NmxVvfZay2Fjx5b23vYaAhk6dOWe61eKzTdvOay1wKwt06alNe/uvHPFLdvm5XKlz3tlvflmy2Hbbtv29Ntv33LYK6+kDbi0pbVwNKL0z6W2Ng1Gb7st4qGHSqvd16NHafMGAFgTBIAAwFpt/fVbH96rV8ctY9q0VX/vv/+94vFbbLHq825LW+Fn89qJrbnhhrYbwWhLa7XnOkprtQuHDGl7+taeq9deQyBtba/evVf8voiIjz5KaxC21kjJiqxMIAsA0Nl8NQEA1mrrrdf68I6slbY6jTXU1Kx4fEfX/osoLbhqzdVXtx3+7blnxAUXtH5LbmeGWYMGtRy2omf6zZzZclh7IWtb+1B76zV/ftrYTGvh39ChEUcfnQaqxx238vMGAFiT1AAEALq9kSNb3go7a9aKnytXqrbCp9WxquHn//5vy2HXXRdxyCGNQWV9fcctrxSbbx7x1FPFw159tbjRl6Zau2W5eYvEza1q+R9+uOXz/XbfPW2deNSoxmEffNDyvQJAAGBt4qsJANCt1NW1HNZagyJ/+1vLYe+9F/Hd76YB0F//GvH+
+xFJsuLldeStyqvjP/9p2djJMcdEfOtbxbUU3323/Xm1Fm4tX75q5Ro3ruWwc89tfbu++27Etde2HL711qu27PY8/HDLYTffXBz+RbT+HEMAgLWJABAAyLTmz3/78MPG1oLzreR+6Ust33fqqenz35r67ncjrrgi4oQTIj7/+fQ20BNOWPHy15bGIFp7VmHzmncNDREnndRyuuatK7f2TMCmrSG31vpwW7761dbLdcIJxc8HfOediK98peW0++8fseGGpS9vZbz1VsthL7xQ3P+Pf0T87nctp1uZbQAA0NkEgABApg0YUNxfXR0xenTETjs11hzbeeeIXXYpnm7GjIhPfSoN/G64IeLLX4744x9bzv9//qdTit3hRo5sOezllyOOOiri0Ucjfvvb9PbWBx5oOd38+cX9/fq1nObuu9PtOHJk+6FoU5tvHnHYYS2HX311xMc/HnHwwRFf+ELE8OEtw7eKirQ2ZmcZPbrlsOOPT2shPvpoxGmnpWVrTVVV55ULAGBleQYgAJBpn/50y1tfm/bX16e19C6+OGLChOLpZs9Oa/215fjjW94OuraqrIwYM6ZliHb99elrRV56Ka0dmL/1d6ONWm9xeMqUtLuyLR//8pfpe5s/b2/27IhJk9p+3//+b9vPCuwIe+0VceWVxcNmzIj49rfbf++0aS33JwCArqIGIACQaV//+orH52/z/cxn0qBr6NDS5nvYYa03qrE2u+mm9qepqGg9CG16u3B5edoCbltKeY5gU/37Rzz+eNoScSkqKiL+9KeII45YueWsrH32iTj88Pan23//lsNuu63DiwMAsMoEgADAGtPas+Pae0ZeW63otva+1oZNnJjeTtqaESOKnzM3enTEiy+moU9rDYNEpLXobrstvS24+fq0tvyePVufT6lKXc+Ils87bD7dqFFpC7t77NH6+7/2tfS5d60Fm3/5S3H/j34Uceyxrc+naYha6mc+dGi6jJtuSm9Fbk1FRfrZvP12xH//d+vTtLa81oZFtP7ZNC/btddG/Oxnrb9/6NCI22+PuOuudD9ravLkiJqaxv7mjcG0VSYAgM6QS5Jop+06AIB1X21txOuvp7eZfvzjaRjWtPXb1lRVRUydmjYcstlm6fPqNt00IpdbI0XuVB98kG6L2bPTIHSrrVYtlJo/P91Gs2en8/jkJzsm3Jo1K211+YMP0hqC22wTsfHGqz/fVVVbm26vd95J95vttuu8xkcAADqaABAAAAAAMswtwAAAAACQYQJAAAAAAMgwASAAAAAAZJgAEAAAAAAyTAAIAAAAABkmAAQAAACADBMAAgAAAECGCQABAAAAIMMEgAAAAACQYQJAAAAAAMgwASAAAAAAZJgAEAAAAAAyTAAIAAAAABkmAAQAAACADBMAAgAAAECGCQABAAAAIMMEgAAAAACQYQJAAAAAAMgwASAAAAAAZJgAEAAAAAAyTAAIAAAAABkmAAQAAACADBMAAgAAAECGCQABAAAAIMMEgAAAAACQYQJAAAAAAMgwASAAAAAAZJgAEAAAAAAyTAAIAAAAABkmAAQAAACADBMAAgAAAECGCQABAAAAIMMEgAAAAACQYQJAAAAAAMgwASAAAAAAZJgAEAAAAAAyTAAIAAAAABkmAAQAAACADBMAAgAAAECGCQABAAAAIMMEgAAAAACQYQJAAAAAAMgwASAAAAAAZJgAEAAAAAAyTAAIAAAAABkmAAQAAACADBMAAgAAAECGlUckXV0G6DI77rhjVxcBAACANeC5557r6iJAl1EDEAAAAAAyTAAIAAAAABkmAAQAAACADBMAAgAAAECGCQABAAAAIMMEgAAAAACQYQJAAAAAAMgwASAAAAAAZJgAEAAAAAAyTAAIAAAAABkmAAQAAACADBMAAgAAAECGCQABAAAAIMMEgAAAAACQYQJAAAAAAMgwASAAAAAAZJgAEAAAAAAyTAAIAAAAABkmAAQAAACADBMAAgAAAECGCQAB
AAAAIMMEgAAAAACQYQJAAAAAAMgwASAAAAAAZJgAEAAAAAAyTAAIAAAAABkmAAQAAACADBMAAgAAAECGCQABAAAAIMMEgAAAAACQYQJAAAAAAMgwASAAAAAAZJgAEAAAAAAyTAAIAAAAABkmAAQAAACADBMAAgAAAECGCQABAAAAIMMEgAAAAACQYQJAAAAAAMgwASAAAAAAZJgAEAAAAAAyTAAIAAAAABkmAAQAAACADBMAAgAAAECGCQABAAAAIMMEgAAAAACQYQJAAAAAAMgwASAAAAAAZJgAEAAAAAAyTAAIAAAAABkmAAQAAACADBMAAgAAAECGCQABAAAAIMMEgAAAAACQYQJAAAAAAMgwASAAAAAAZJgAEAAAAAAyTAAIAAAAABkmAAQAAACADBMAAgAAAECGCQABAAAAIMMEgAAAAACQYQJAAAAAAMgwASAAAAAAZJgAEAAAAAAyTAAIAAAAABkmAAQAAACADBMAAgAAAECGCQABAAAAIMMEgAAAAACQYQJAAAAAAMgwASAAAAAAZJgAEAAAAAAyTAAIAAAAABkmAAQAAACADBMAAgAAAECGCQABAAAAIMMEgAAAAACQYQJAAAAAAMgwASAAAAAAZJgAEAAAAAAyTAAIAAAAABkmAAQAAACADBMAAgAAAECGCQABAAAAIMMEgAAAAACQYQJAAAAAAMgwASAAAAAAZJgAEAAAAAAyTAAIAAAAABkmAAQAAACADBMAAgAAAECGCQABAAAAIMMEgAAAAACQYQJAAAAAAMgwASAAAAAAZJgAEAAAAAAyTAAIAAAAABkmAAQAAACADBMAAgAAAECGCQABAAAAIMPKu7oAAACsvoaGhsjlcrF8+fJYunRpNDQ0RE1NTdTW1kZdXV1ERNTV1UV9fX306NEjysvLo2/fvtHQ0FAYVlFREWVlZdGvX78oLy+PsjL/KwYAyAIBIADAOiwf6i1cuDBqamoKAeDy5csjSZI237N06dJYvHhxYZqysrKYP39+lJeXR5IksfHGG8eGG24oBAQAyAABIADAOmjRokVRX18fNTU1UVVVFXV1dbF8+fKiacrL0696ZWVlkcvlokePHpHL5SJJkqivr48kSaKhoSHKyspi2bJlERGF7qJFi2LgwIECQACADBAAAgCsQ5YuXRrV1dVRVVUV1dXVkcvlor6+vjC+X79+kcvlory8PNZff/1Yf/31o6GhIXr06BF9+vSJ5cuXR0NDQyRJEkuXLo0ePXpEWVlZVFVVxfLly6OqqirWX3/9GDBgQPTs2bML1xQAgI4iAAQAWMslSRK5XC4++OCDWLx4cSxcuLAwvGfPntGzZ8/o1atXDB48ONZbb73o0aNH9OzZM5IkiR49ehTNq1evXoW/+/TpU/i7X79+UVdXF5tssknU1tbGgAED1si6AQDQ+QSAAABrsSRJYuHChVFVVRVz5swpDO/du3f06dMnBgwYEOuvv35h2OooLy+P8vLy1Z4PAABrFwEgAMBaqKGhIRoaGmL27Nkxb968WLp0aURE5HK5GDBgQAwePDgqKioiIjynDwCAFRIAAgCsZRoaGqK2tjZmzpwZixYtKjTuscEGG0SfPn1i4MCBsd566wn+AAAoiQAQAGAtk3/WX3V1dURE9OzZM/r37x+VlZXRt2/fLi4dAADrGgEgAMBaZObMmbFgwYJYsmRJRESst956seGGG8bgwYNbNOgBAAClEAACAKwl5syZE/PmzSuEfxtttFEMGDAg+vfv38UlAwBgXSYABABYC8yePTs++OCDaGhoiIiIgQMHxoYbbhh9+vTp4pIBALCuEwACAHSx+fPnx/Tp0wv9+ef9rbfeepHL5bqwZAAAZIEAEACgC9XW1sYHH3xQ6K+oqIgtt9wykiTRyi8AAB1CAAgA0EXq6uriww8/jPr6+ohIG/zYYostIpfLqfkHAECH8W9lAIAukCRJVFVVxdKlS2P58uVRVlYWW2yxRfTs2bOriwYAQMYI
AAEAusD8+fNj/vz5sXjx4oiI2GSTTaJXr15dXCoAALJIAAgAsIYtXLgwqqqqYuHChRERMXjw4BgwYECUl3s6CwAAHU8ACACwBjU0NERtbW3Mnz8/ItJGPyoqKmK99dbr4pIBAJBVAkAAgDWorq4uZs6cGRERvXr1ioEDB8bAgQO7uFQAAGSZABAAYA1JkiQWLFhQaOG3vr4++vfvH2VlvpIBANB5fNsEAFhDli5dGlVVVbF8+fKISBv+0OovAACdTQAIALCGLFmypNDwR0VFRfTt21ftPwAAOp2m5gAA1oClS5fG4sWLC4Ff3759o6KiootLBQBAd+BfzgAAa8CyZcti/vz50dDQED179oxNNtmkq4sEAEA3IQAEAFgDqqurY8mSJRER0adPn6ivr+/iEgEA0F0IAAEAOtny5ctj2bJlhf4+ffpEr169urBEAAB0JwJAAIBOliRJofGPXr16RZ8+fbq4RAAAdCcCQACATrZo0aLI5XIREVFfXx99+/bt4hIBANCdCAABADpZjx49IkmSiIgYMGBA9OjRo4tLBABAdyIABADoZAsXLoyGhoaIiOjZs2eUl5d3cYkAAOhOBIAAAJ1s6dKlUV9fH7169YoBAwZ0dXEAAOhmBIAAAJ0sf8tvQ0NDoSYgAACsKQJAAIBOVlZWFrlcLsrKyqK+vr6riwMAQDcjAAQA6GT19fWFRkB69+7dxaUBAKC7EQACAHSyurq6iIhYtmyZW4ABAFjjBIAAAJ1s8eLFERHRt2/f6NmzZxeXBgCA7kYACADQyfKNgESEABAAgDVOAAgA0MnKy8sjIn0WYG1tbReXBgCA7kYACADQyXr16hUREUmSxPLly7u4NAAAdDcCQACATpbL5SIibQxkyZIlGgIBAGCNEgACAHSyvn37Ro8ePaKhoSGWL18eSZJ0dZEAAOhGBIAAAJ1s/fXXL9QCXLJkSeFvAABYEwSAAACdrKGhIcrK0q9d9fX1ngMIAMAaJQAEAOhkvXv3jt69e0dE2hBIfX19F5cIAIDuRAAIANDJevbsGT179oyIiOXLl8eSJUu6uEQAAHQnAkAAgE6Wy+Wib9++UV5eHsuWLYuqqiohIAAAa4wAEABgDRgwYED06dMnIiKqq6tj2bJlWgMGAGCNEAACAKwBPXv2jAEDBkQul4v6+vpYsGCB1oABAFgjBIAAAGtAWVlZ9O7dO3r16hVJkkRtbW3U1dV1dbEAAOgGBIAAAGvIeuutF2Vl6devJUuWRHV1dReXCACA7kAACACwhvTs2TP69+8fEWlrwLNmzVILEACATicABABYQ8rKymLAgAHRr1+/iEhrAc6ZMyeWL1/exSUDACDLBIAAAGtQnz59YuDAgVFeXh719fUxd+7cqKmp0SIwAACdRgAIALAG9ejRIwYOHBh9+vSJiLQW4IIFC9wKDABApxEAAgCsYT179oyNN944+vbtGxERc+fOjY8++kgtQAAAOoUAEABgDcvlctG/f//YaKONory8PCIiqqqqoqampotLBgBAFgkAAQC6SP/+/WODDTaIsrKyWLp0acyePVsICABAhxMAAgB0kV69esWgQYOiV69e0dDQEFVVVTFz5sxYunRpVxdtleVbNl60aJFbmgEA1hICQACALrT++uvHZpttVmgUZMGCBTFt2rRYsmRJF5ds5dXX18cHH3wQH374YcydOzfq6+u7ukgAAIQAEACgS5WVlcWAAQNis802i4qKisjlclFfXx+zZs2K2trari5eyRYvXhwffPBBVFVVxfLly6OmpmadrskIAJAl5V1dAAAA0pqA9fX1UVdXF7W1tTF//vyIiBg8eHD07t07evbs2cUlbF2SJFFTUxOzZ8+OqqqqiIioqKiIDTfcMPr169e1hQMAICIEgAAAa4UePXrEoEGDonfv3jFjxoxYtGhRzJkzJ6qqqmKjjTaKDTfcMHr16tXVxSxIkiTq6urio48+iqqqqkJtvz59+sSQIUNi/fXX7+IS
AgCQJwAEAFiL9OvXLzbddNOYNWtWVFdXR11dXcydOzd69OgR/fr1i379+kUul+vqYkZtbW3MmTMnPvroo4hIb2Veb731YtNNN40NNtigi0sHAEBTAkAAgLVMv379YqONNor6+vqoqamJZcuWxYwZM6J3796x0UYbxfrrr99lt9fW1dXFvHnzYsGCBbF48eKIiCgvL48NNtggBg0apOYfAMBaSAAIALCWKSsriw022CA22GCDmDt3bsyePTuWLFkSy5Yti9mzZ8eCBQti0KBB0a9fvygvL4/y8vJOqxWYJEnkcrmoq6uL6urqmDVrVixZsqSohd9NNtkkBgwYsFbdogwAQCMBIADAWmyDDTaIhoaGqK6ujgULFsSyZcti2bJlsXjx4sItwfmGQjpaPnSsra2NRYsWRU1NTSxfvjwi0mcWrrfeerHJJpu45RcAYC0nAAQAWIuVl5fHRhttFIMHD465c+dGTU1NVFdXx/Lly2PhwoWxcOHC+PDDD2PAgAGFWoM9e/aMurq6yOVy0aNHj3aXka/llyRJLFmyJHr16hULFiyIqqqqqK6ujoaGhkiSJJIkiYj0FuWKiooYMGBA9O3bt7M3AQAAq0kACACwDigrKyvc9tuvX79YtGhRLFq0KOrq6qK+vj6qqqqiqqoqIiLWX3/96NOnT/Tr1y+SJImePXtGWVlZVFRURE1NTZSVlUWvXr1i6dKlsWTJkoiIqK6ujvr6+qirq4uampro1atXLFu2rKgMvXr1igEDBsTAgQOjT58+JYWLAAB0PQEgAMA6okePHtG3b9/o27dvLF26NGpqaqKmpiYWLlxYFNblh3/00UdRXl4evXv3jkWLFkWPHj2id+/e0bt371i4cGGUl5cXAsR8DcC8ZcuWRXl5efTp0yc22GCDyOVy0bt376ioqIiysrKuWH0AAFaRABAAYB2UD/IGDx4cS5cujQULFhSeEZhvtKPpKyKivr4+Fi9eXGi9t2lDHvnwr2fPntGzZ88oLy+P/v37x4ABAwo1CAEAWDcJAAEA1nG9e/eOysrKqKysjPr6+kKDHUmSFBrtyN/Wmw8Hc7lc9OrVK+rq6qKhoSH69+8fZWVlhVp+y5Yt83w/AICMEAACAGRIjx49on///tG/f//Vmk95ua+JAABZ4V4OAAAAAMgwASAAAAAAZJgAEAAAAAAyTAAIAAAAABkmAAQAAACADBMAAgAAAECGCQABAAAAIMMEgAAAAACQYQJAAAAAAMgwASAAAAAAZJgAEAAAAAAyTAAIAAAAABkmAAQAAACADBMAAgAAAECGCQABAAAAIMMEgAAAAACQYQJAAAAAAMgwASAAAAAAZJgAEAAAAAAyTAAIAAAAABkmAKRbS5JEv379+vXr169fv379+vV3g37oznKJI4JubIcddujqIgAAALAGPP/8811dBOgyagDS7eVyOV1dXV1dXV1dXV1dXd1u0IXuSg1AujU1AAEAALoHNQDpztQABAAAAIAMEwACAAAAQIYJAOnWuroVKv369evXr1+/fv369evXv2b6oTvzDEC6tTFjxnR1EQAAAFgDXnjhha4uAnQZNQABAAAAIMMEgAAAAACQYQJAAAAAAMiw8q4uAHQlj8AEAAAAsk4NQAAAAADIMDUA6dbUAAQAAACyTg1AAAAAAMgwASAAAAAAZJgAkG4tfwuwrq6urq6urq6urq6ubra70J3lEkcC3dh2223X1UUAAABgDXjppZe6ugjQZdQABAAAAIAM0wow3ZoKsAAAAEDWqQEIAAAAABmmBiDdmhqAAAAAQNapAQgAAAAAGaYGIN2aGoAAAABA1qkBCAAAAAAZpgYg3ZoagAAAAEDWqQEIAAAAABmmBiDdmhqAAAAAQNYJAOnWmgaAuVxOv379+vXr169fv379+vVntB+6s1zS9IiAbmbrrbfu6iIAAACwBrz22mtdXQToMmoA0q3JvwEAAICsEwDSrQkAAQAAgKzTCjAAAAAAZJgagHRr
agACAAAAWacGIAAAAABkmBqAdGtqAAIAAABZJwCkWxMAAgAAAFknAKRbEwACAAAAWecZgAAAAACQYWoA0q2pAQgAAABknQCQbk0ACAAAAGSdAJBuTQAIAAAAZJ1nAAIAAABAhqkBSLemBiAAAACQdQJAujUBIAAAAJB1AkC6NQEgAAAAkHWeAQgAAAAAGaYGIN2aGoAAAABA1gkA6dYEgAAAAEDWCQDp1gSAAAAAQNZ5BiDdWvMAUL9+/fr169evX79+/d2zf+DAgZEkSQwcODD23HPPGD58+FpVviz39+jRI7beeuvYaqutokePHp22POjOcokjgm5sk0026dD5DRkyJCIiZs2a1aHzBQAAOs+gQYNixx13jIceeigGDRoUw4YNi0GDBsW8efPiueee6+ritTB8+PAYOHBgof+dd96JefPmRUQUyp83f/78ePvtt9d4GVfG1ltvHZtvvnlEREyfPj1ee+21TlnOhx9+2CnzhXWBW4Dp1jo6/06SJLbbbrv48MMP44033oilS5d26Pxb86lPfSp+/vOfx4gRI2LZsmXx1FNPxWGHHRb19fWtTv/Vr341Tj/99Nh0001jwYIF8fvf/z4uvvjiwvif/OQn8eUvfzkqKirio48+itNPPz0efvjhTl8PAADoCoMGDYoddtghkiSJJEli7ty5MXfu3Bg+fHgMHz489txzz3jooYe6uphFBgwYEAMHDiyEfvmyN/970KBBRf1rq7bKD3QcASDdWmcEgEmSxMYbbxwbbbRRvPHGGzF9+vQOXUZT/fr1i7vuuit69OgRF198cWyyySZx5JFHxj333BP77LNPi+l33HHHuPLKK2P69Onxwx/+MPbaa6845ZRTYtmyZXHFFVfEd7/73TjqqKPi0Ucfjb/+9a9x3HHHxU033RRf+MIX4sUXX+y09QAAgK7QNPyLKP598NZbb0WSJDF8+PAYNmzYWleLbt68efHss8+2GJ4PMCMixo4dGxFr/62w06ZNK5TxjTfeWOvLC+sizwCkW2t+oV/dblP551iMGTMm1l9//U5Z3qGHHhrrr79+nHbaafGLX/wivve978U///nPGDNmTAwYMCCSJIk777wz7rnnnoiIOPvssyMiYpdddonrrrsuvvKVr0RtbW1861vfioiI4447Lj766KP46le/Gtddd11MnDgxcrlcfPe73+2U8uvq6urq6urq6uquye7AgQMLgd7w4cNjxx13jKaSJCm6hfatt96K+fPnx/Dhw9eK8me1W1dXF6+99lpMnTo16uvrO2050J2pAUi3liRJh15Ums4vb/DgwbHTTjvFW2+9FW+//XaHLu9Tn/pURET86U9/ioaGhohI/3s2ZsyYGD16dDz22GMxevTo6NGjRzQ0NMTHPvaxmDt3bixevLhQvo8++iiGDh0aDQ0N0a9fv/j73/9eWI/p06dHkiQxcuTIwvy7+suBrq6urq6urq6u7qp2Bw4cWPR8vPy4pv0NDQ0xfPjwmDdvXsybNy/efPPNGDt2bAwYMCDmzZu3VqzHW2+9Vfi7I6br6m6PHj1i5MiREZH+nmloaOi05UF3JQCENWTLLbcs3Bacf1ZHR8yzoaGh6Hl/r776akREbLbZZhERse2220ZZWVrZd8MNN2zx4Nv//Oc/8bGPfSw+8YlPRC6Xa9GASU1NTaEGIwAAdBf5EHBtVGq51tbyNzdy5MhCIyAREVOnTu3C0kA2CQDp1jr6v0BN/7O2ovEdtdz11luvMN+85cuXR0TE+++/H0mSRE1NTWFceXl59OjRo2j6+vr6aGhoiL59+xamaT6+urraf8wAAFjnrcz39Xxtv/xr4MCBhWfr0bGa11D02wM6nmcA0q2tyWrlb731Vjz55JMdetvAnDlzoqysLAYOHFgYPmTIkIiI+Mc//tFi+sWLF8fgwYOLhg8aNCiqqqrinXfeiYiILbbYomh8375948033+zU7aWrq6urq6urq6u7JrttaT4+SdLvyxGx
1tz+G5E+yzDfwm9HTNfV3ddffz2mT58e//nPf4oaBFlTnzt0B2oA0q0lSec/A3DOnDkxbdq0qK6u7rDl5LvPPvts7LrrrvG5z30ubr/99oiI+NznPhe1tbVRW1vbYvp///vfse2220ZFRUUsXLgwIiJGjBgRr776aixatCiWLVsWW265ZWE9Ro4cGb169Yrnn3/eMwB1dXV1dXV1dXXX+e7cuXML/RHpI3WaSpL0GYBvvvlm4fl5+efRzZkzp8vLn+/mGyWZM2dOh0zX1d2GhoZ45ZVXoqnOWh50V2oAQiepq6uLqVOnxnPPPVcI/zraVVddFUmSxOWXXx6f+9zn4jvf+U5st9128eyzzxamueeee+LBBx+MiIirr746crlc3H///fFf//Vfccstt0Tv3r3jjjvuiIiIyZMnR2VlZfz85z+P//qv/4pJkyZFkqQtCQMAwLpu3rx58dZbbxVezzzzTJvTRKQB4ZZbblnop3P06NEjRo0aFaNGjYoePXp0dXEgk3KJGJxuLH/rbEfZeOON49Of/nR88MEHMW3atFi6dGmHzr81X/3qV+NXv/pV5HK5iIj48MMPY/vtty8s+7333ouePXsWGgW59tpr46CDDiq8/6GHHoqvfvWrERHRs2fPeOqppwotoyVJEt///vfjV7/6VaevBwAAdIVBgwbFuHHjIiLiL3/5S2F4PvxrPnxtMG7cuBg0aFChkY+33nqr8PegQYMK5c5P01rQuTYZNWpUfOxjH4uItJHCzmoEZP78+Z0yX1gXCADp1gYMGFAIziLSwGt1+ocMGRK5XC5mzpzZIfMrtb93796x1157xdSpU+Ptt99ud/rBgwfH7rvvHo888kjRRTA/fsstt4xRo0bF/fffH3V1dZ1efv369evXr1+/fv36u7I/HwI+8MADkcvlCgHb3Llzi+6uWVvKu+WWWxae7ZfL5QoBYJIkMXjw4EIAmCRJzJ8/v1CDcW0pf/P+fACYJElMnz69EAB29PIEgHRnAkC6tQEDBnR1EQAAgLVAvrZcPgxcF2rOZUWPHj1iq622ioi0QZD6+vpOWU5VVVWnzBfWBQJAurUNNtigq4sAAADAGrBgwYKuLgJ0Ga0A063JvwEAAICs0wowAAAAAGSYGoB0a2oAAgAAAFmnBiDdWj4A1NXV1dXV1dXV1dXV1c12F7ozjYDQra2//vpdXQQAAADWgJqamq4uAnQZNQDp1rr6P1C6urq6urq6urq6urq6a6YL3ZkagHRr/fr16+oiAAAAsAYsWrSoq4sAXUYjIHRr8m8AAAAg6wSAdGsCQAAAACDrPAMQAAAAADJMDUC6NTUAAQAAgKxTAxAAAAAAMkwNQLo1NQABAACArFMDEAAAAAAyTA1AujU1AAEAAICsEwDCGiJsBAAAKJbL5bq6CNAtCABhNWy++eax1VZbRa9evbq6KAAAAJm0bNmyeP3112P69OldXRRYZ3kGIN1ae7Xy2hsv/AMAAOhcvXr1iq222mq1f79Bd6YGIN2eiwgAAMC6we83WDVqAMJqeO2112LZsmVdXQwAAIDMWrZsWbz22mtdXQxYp+US8TjdWHm5SrAAAADdQV1dXVcXAbqMGoAAAAAAkGECQAAAAADIMAEg3Vr+DnhdXV1dXV1dXV1dXV3dbHehO/MMQLq1Hj16dHURAAAAWAPq6+u7ugjQZdQABAAAAIAMEwACAAAAQIYJAOnW2ns2hPHGG2+88cYbb7zxxhtvvPHZGA/dmWcA0q2VlcnAAQAAuoOGhoauLgJ0GekHAAAAAGSYABAAAAAAMkwACAAAAAAZJgAEAAAAgAwTAAIAAABAhpV3dQGgK2kEGwAAAMg6NQDp9nK5nK6urq6urq6urq6urm436EJ3lUtUgaIbcxEAAADoHsQfdGdqANLtdfV/oHR1dXV1dXV1dXV1dXXXTBe6KzUA6dZcBAAAALoH8QfdmRqAAAAAAJBhWgGmW/MfIAAAACDr1AAEAAAAgAwT
[base64-encoded PNG data truncated]\"\u003e",
"metadata": "{}",
"quality": [],
"quality-suggestion": null,
"quality-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
}
}
```
### Data Fields
Among the dataset fields, we differentiate between the following:
* **Fields:** These are the dataset records themselves; for the moment, only text fields are supported. These are the ones that will be used to provide responses to the questions.
* **image** is of type `text`.
* **html_code** is of type `text`.
* **Questions:** These are the questions that will be asked to the annotators. They can be of different types, such as `RatingQuestion`, `TextQuestion`, `LabelQuestion`, `MultiLabelQuestion`, and `RankingQuestion`.
* **accuracy** is of type `rating` with the following allowed values [1, 2, 3, 4, 5, 6, 7].
* **quality** is of type `multi_label_selection` with the following allowed values ['clean code', 'efficient', 'proper tags and classes'].
* **correction** is of type `text`.
* **Suggestions:** As of Argilla 1.13.0, the suggestions have been included to provide the annotators with suggestions to ease or assist during the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not just the suggestion itself, but also the metadata linked to it, if applicable.
* (optional) **accuracy-suggestion** is of type `rating` with the following allowed values [1, 2, 3, 4, 5, 6, 7].
* (optional) **quality-suggestion** is of type `multi_label_selection` with the following allowed values ['clean code', 'efficient', 'proper tags and classes'].
* (optional) **correction-suggestion** is of type `text`.
Additionally, we also have two more fields that are optional and are the following:
* **metadata:** An optional field that can be used to provide additional information about the dataset record. This can give annotators extra context — for example, a link to the original source of the record, or details such as the author, the date, or the source. The metadata is always optional, and can be linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
* **external_id:** This is an optional field that can be used to provide an external ID for the dataset record. This can be useful if you want to link the dataset record to an external resource, such as a database or a file.
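For illustration, a record's responses can be checked against the allowed values listed above with a small validator. This helper is hypothetical (not part of Argilla) and only mirrors the question schema of this dataset:

```python
# Allowed values taken from the question definitions above (hypothetical validator, not Argilla API)
ALLOWED_RATINGS = {1, 2, 3, 4, 5, 6, 7}
ALLOWED_LABELS = {"clean code", "efficient", "proper tags and classes"}

def validate_response(response: dict) -> bool:
    """Return True if a single annotation response matches the question schema."""
    # `accuracy` is a rating question restricted to 1..7
    if response.get("accuracy") not in ALLOWED_RATINGS:
        return False
    # `quality` is a multi-label question; every label must be in the allowed set
    if not set(response.get("quality", [])) <= ALLOWED_LABELS:
        return False
    # `correction` is a free-text question
    return isinstance(response.get("correction", ""), str)
```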
### Data Splits
The dataset contains a single split, which is `train`.
## Dataset Creation
To create this dataset we used the following snippet:
```python
import argilla as rg
from argilla.client.feedback.utils import image_to_html
from datasets import load_dataset

# Load the original dataset
dataset = load_dataset("HuggingFaceM4/WebSight", split='train', streaming=True)

# Shuffle the samples to avoid any bias
shuffled_dataset = dataset.shuffle(seed=50, buffer_size=5_000)

# Take a sample of 5000
subset = shuffled_dataset.take(5000)

# Format the text to be rendered in markdown
def add_json_formatting(example):
    example['text'] = '```json\n' + example['text'] + '\n```'
    return example

updated_subset = subset.map(add_json_formatting)

# Set a temporary path to save the image
temp_img_path = "temp_img.png"

# Iterate over the samples in the subset, building one record per sample
records = []
for entry in updated_subset:
    # Save the image to the temporary path
    entry["image"].save(temp_img_path, format="png")
    record = rg.FeedbackRecord(
        fields={
            "image": image_to_html(temp_img_path, file_type="png"),
            "html_code": entry["text"],
        },
        suggestions=[
            {
                "question_name": "correction",
                "value": entry["text"],
            }
        ],
    )
    records.append(record)

# Add the collected records to the FeedbackDataset
# (`ds` is the FeedbackDataset configured earlier in the workflow, not shown here)
ds.add_records(records, show_progress=True)
```
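The `image_to_html` helper used above embeds the saved PNG as a base64 data URI inside an `<img>` tag, which is why the example record's `image` field contains a long base64 string. A rough stdlib sketch of that conversion — an illustration, not Argilla's actual implementation — looks like:

```python
import base64

def image_to_html_sketch(path: str, file_type: str = "png") -> str:
    """Embed an image file as a base64 data URI inside an <img> tag
    (sketch of what Argilla's `image_to_html` produces)."""
    with open(path, "rb") as f:
        encoded = base64.b64encode(f.read()).decode("ascii")
    return f'<img src="data:image/{file_type};base64,{encoded}">'
```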
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
hassansh/Mistral-7B-Instruct-v0.2 | ---
dataset_info:
features:
- name: subject
dtype: string
- name: accuracy
dtype: float64
- name: accuracy_abcd
dtype: float64
- name: abcd_avg_probs
sequence: float64
- name: abcd_std_probs
sequence: float64
- name: num_qs
dtype: int64
- name: time
dtype: float64
splits:
- name: test
num_bytes: 7138
num_examples: 57
download_size: 9199
dataset_size: 7138
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|