| datasetId | card |
|---|---|
dog/fuego-20230214-214112-1d6fb3 | ---
tags:
- fuego
fuego:
id: 20230214-214112-1d6fb3
status: done
script: run.py
requirements_file: requirements.txt
space_id: dog/fuego-20230214-214112-1d6fb3
space_hardware: cpu-basic
---
|
joey234/mmlu-abstract_algebra-original-neg | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 1352.12
num_examples: 7
download_size: 3607
dataset_size: 1352.12
---
# Dataset Card for "mmlu-abstract_algebra-original-neg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-acronym_identification-default-01d2b7-2476976473 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- acronym_identification
eval_info:
task: entity_extraction
model: lewtun/autotrain-acronym-identification-7324788
metrics: ['bertscore', 'angelina-wang/directional_bias_amplification']
dataset_name: acronym_identification
dataset_config: default
dataset_split: validation
col_mapping:
tokens: tokens
tags: labels
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: lewtun/autotrain-acronym-identification-7324788
* Dataset: acronym_identification
* Config: default
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@wjenkins](https://huggingface.co/wjenkins) for evaluating this model. |
Falah/2M_arabic_architectural_futuristic_SDXL_refiner_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 1428195862
num_examples: 2000000
download_size: 135721444
dataset_size: 1428195862
---
# Dataset Card for "2M_arabic_architectural_futuristic_SDXL_refiner_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-conceptual_physics-rule-neg | ---
dataset_info:
features:
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: question
dtype: string
splits:
- name: test
num_bytes: 41680
num_examples: 235
download_size: 24838
dataset_size: 41680
---
# Dataset Card for "mmlu-conceptual_physics-rule-neg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shidowake/augmxnt_ultra-orca-boros-en-ja-v1_split_14 | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: weight
dtype: float64
- name: source
dtype: string
splits:
- name: train
num_bytes: 20639999.933149945
num_examples: 9397
download_size: 10534084
dataset_size: 20639999.933149945
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
youngryu/CustomDataSet | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 5826
num_examples: 39
download_size: 2572
dataset_size: 5826
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
eren23/tr-snli-small | ---
license: cc-by-4.0
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 68011
num_examples: 441
download_size: 36196
dataset_size: 68011
---
|
EitanG98/asl_letters | ---
license: unlicense
---
|
ivelin/rico_sca_refexp_synthetic | ---
license: apache-2.0
task_categories:
- question-answering
language:
- en
pretty_name: RICO SCA RefExp
size_categories:
- 10K<n<100K
dataset_info:
- config_name: rico_sca_refexp
features:
- name: image
dtype: image
- name: image_id
dtype: string
- name: labels
list:
- name: prompt
dtype: string
- name: target_bounding_box
struct:
- name: xmin
dtype: float32
- name: ymin
dtype: float32
- name: xmax
dtype: float32
- name: ymax
dtype: float32
splits:
- name: train
num_bytes: 2605508469
num_examples: 24063
- name: validation
num_bytes: 21192787
num_examples: 160
- name: test
num_bytes: 22057836
num_examples: 185
download_size: 6514703641
dataset_size: 2605508469
---
This dataset is derived from the RICO SCA dataset presented by Google Research in the seq2act paper. It is a synthetically generated dataset for the UI RefExp task.
See the original repo for details and licensing info:
https://github.com/google-research/google-research/blob/master/seq2act/data_generation/README.md#generate-ricosca-dataset
The splits in this dataset are consistent with the splits in the crowdsourced [UIBert RefExp](https://huggingface.co/datasets/ivelin/ui_refexp_saved) dataset. Training split examples do not include images from the Validation or Test examples in the UIBert RefExp dataset; respectively, the images in the Validation and Test splits here match the images in the Validation and Test splits of UIBert RefExp.
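As a minimal illustration of the `target_bounding_box` struct in the schema above, the sketch below converts a box to pixel coordinates. It assumes the float32 coordinates are normalized to [0, 1] (an assumption to verify against the actual data; the field and key names come from the schema, the numbers are made up):

```python
def bbox_to_pixels(bbox: dict, width: int, height: int) -> dict:
    """Scale a normalized {xmin, ymin, xmax, ymax} box to pixel space."""
    return {
        "xmin": bbox["xmin"] * width,
        "ymin": bbox["ymin"] * height,
        "xmax": bbox["xmax"] * width,
        "ymax": bbox["ymax"] * height,
    }

# Hypothetical box on a 1440x2560 screenshot.
box = {"xmin": 0.25, "ymin": 0.5, "xmax": 0.75, "ymax": 0.125}
print(bbox_to_pixels(box, width=1440, height=2560))
# {'xmin': 360.0, 'ymin': 1280.0, 'xmax': 1080.0, 'ymax': 320.0}
```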
|
LNTANOooo/sharegpt52k | ---
dataset_info:
features:
- name: conversation
list:
- name: role
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 732668006
num_examples: 58390
download_size: 303887756
dataset_size: 732668006
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yleo/aqua-binarized-1 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 21307
num_examples: 10
download_size: 30225
dataset_size: 21307
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Nexdata/63503_Traffic_Accident_Videos_Data | ---
license: cc-by-nc-nd-4.0
---
## Description
63,503 Traffic Accident Videos Data. The data covers highways, crossroads, rural roads, etc. It includes multiple scenes, different times of day, multiple weather conditions (sunny, cloudy, rainy, snowy), and multiple photographic devices. The data can be used for tasks such as traffic accident detection.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1060?source=Huggingface
# Specifications
## Data size
63,503 videos, including 9,691 videos shot by surveillance cameras, 46,949 videos shot by automobile data recorders, 3,189 videos shot by cellphones, and 3,674 videos shot by cameras
## Collecting environment
including highways, crossroads, rural roads, etc.
## Diversity
multiple scenes, different times, multiple weather conditions (sunny, cloudy, rainy, snowy), multiple photographic devices
## Device
surveillance camera, automobile data recorder, cellphone, camera
## Collecting time
day, night
## Image Parameter
The video data format is .mp4.
# Licensing Information
Commercial License
|
zhixing-xu/train_cfn | ---
license: apache-2.0
---
|
tyzhu/squad_qa_wrong_rare_v5_full | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: correct_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 7374288
num_examples: 5070
- name: validation
num_bytes: 349767
num_examples: 300
download_size: 1503736
dataset_size: 7724055
---
# Dataset Card for "squad_qa_wrong_rare_v5_full"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dadofalin/coderefine0824 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: function
dtype: string
- name: validationType
dtype: string
- name: fixed
dtype: string
splits:
- name: train
num_bytes: 1024101
num_examples: 324
download_size: 317188
dataset_size: 1024101
---
# Dataset Card for "coderefine0824"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Liareizz/LIAREIZZ | ---
license: openrail
---
|
lapki/perekrestok-reviews | ---
task_categories:
- text-classification
- text-generation
language:
- ru
tags:
- reviews
size_categories:
- 100K<n<1M
pretty_name: Dataset of user reviews from "Перекрёсток/Perekrestok" shop.
---
### Dataset
Dataset of user reviews from "Перекрёсток/Perekrestok" shop.
### Dataset Format
The dataset is in JSON Lines format. Fields:
- `product_id`: Product internal ID (https://www.perekrestok.ru/cat/1/p/ID)
- `product_name`: Product name
- `product_category`: Category of the product
- `product_price`: Product price in RUB (decimal)
- `review_id`: Review internal ID
- `review_author`: Author of the review
- `review_text`: Text of the review
- `rating`: Review rating (decimal, from 0.0 to 5.0)
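A minimal sketch of parsing one JSON Lines record; only the field names come from the description above, the values are made up:

```python
import json

# One hypothetical line from the JSONL file.
line = ('{"product_id": 123, "product_name": "Milk", '
        '"product_category": "Dairy", "product_price": 79.99, '
        '"review_id": 1, "review_author": "user", '
        '"review_text": "Good", "rating": 4.5}')

record = json.loads(line)
print(record["product_name"], record["rating"])
```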
|
liuyanchen1015/VALUE_cola_uninflect | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 12887
num_examples: 172
- name: test
num_bytes: 13595
num_examples: 185
- name: train
num_bytes: 95853
num_examples: 1323
download_size: 62088
dataset_size: 122335
---
# Dataset Card for "VALUE_cola_uninflect"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Hyeoli/layoutlmv3_cord | ---
dataset_info:
features:
- name: id
dtype: string
- name: words
sequence: string
- name: bboxes
sequence:
sequence: int64
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': I-menu.cnt
'2': I-menu.discountprice
'3': I-menu.nm
'4': I-menu.num
'5': I-menu.price
'6': I-menu.sub_cnt
'7': I-menu.sub_nm
'8': I-menu.sub_price
'9': I-menu.unitprice
'10': I-sub_total.discount_price
'11': I-sub_total.etc
'12': I-sub_total.service_price
'13': I-sub_total.subtotal_price
'14': I-sub_total.tax_price
'15': I-total.cashprice
'16': I-total.changeprice
'17': I-total.creditcardprice
'18': I-total.emoneyprice
'19': I-total.menuqty_cnt
'20': I-total.menutype_cnt
'21': I-total.total_etc
'22': I-total.total_price
- name: image
dtype: image
splits:
- name: train
num_bytes: 1296349383.0
num_examples: 800
- name: test
num_bytes: 162954804.0
num_examples: 100
- name: validation
num_bytes: 171507971.0
num_examples: 100
download_size: 1628026145
dataset_size: 1630812158.0
---
# Dataset Card for "layoutlmv3_cord"
The original dataset is "naver-clova-ix/cord-v2". This dataset has been modified for learning purposes.
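A minimal sketch of decoding `ner_tags` ids back to their label names, using the `class_label` names listed in the schema above:

```python
# Label names copied from the class_label mapping in the dataset schema.
NER_LABELS = [
    "O", "I-menu.cnt", "I-menu.discountprice", "I-menu.nm", "I-menu.num",
    "I-menu.price", "I-menu.sub_cnt", "I-menu.sub_nm", "I-menu.sub_price",
    "I-menu.unitprice", "I-sub_total.discount_price", "I-sub_total.etc",
    "I-sub_total.service_price", "I-sub_total.subtotal_price",
    "I-sub_total.tax_price", "I-total.cashprice", "I-total.changeprice",
    "I-total.creditcardprice", "I-total.emoneyprice", "I-total.menuqty_cnt",
    "I-total.menutype_cnt", "I-total.total_etc", "I-total.total_price",
]

def decode_tags(tag_ids):
    """Convert a sequence of ner_tags class ids to their string labels."""
    return [NER_LABELS[i] for i in tag_ids]

print(decode_tags([0, 3, 5]))  # ['O', 'I-menu.nm', 'I-menu.price']
```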
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
k0ntra/tehranen2 | ---
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
splits:
- name: train
num_bytes: 294912
num_examples: 96
download_size: 673328
dataset_size: 294912
---
# Dataset Card for "tehranen2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ovior/twitter_dataset_1713226767 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2337331
num_examples: 7215
download_size: 1318515
dataset_size: 2337331
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Nsohko/imda-dataset | ---
dataset_info:
- config_name: CHANNEL0FCHINESE
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL0FINDIAN
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL0FMALAY
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL0FOTHERS
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL0Fall
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL0MCHINESE
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL0MINDIAN
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL0MMALAY
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL0MOTHERS
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL0Mall
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL0allCHINESE
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL0allINDIAN
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL0allMALAY
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL0allOTHERS
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL0allall
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL1FCHINESE
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL1FINDIAN
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL1FMALAY
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL1FOTHERS
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL1Fall
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL1MCHINESE
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL1MINDIAN
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL1MMALAY
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL1MOTHERS
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL1Mall
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL1allCHINESE
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL1allINDIAN
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL1allMALAY
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL1allOTHERS
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL1allall
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL2FCHINESE
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL2FINDIAN
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL2FMALAY
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL2FOTHERS
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL2Fall
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL2MCHINESE
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL2MINDIAN
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL2MMALAY
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL2MOTHERS
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL2Mall
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL2allCHINESE
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL2allINDIAN
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL2allMALAY
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL2allOTHERS
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: CHANNEL2allall
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 105139921
num_examples: 682
- name: test
num_bytes: 103309694
num_examples: 693
download_size: 0
dataset_size: 208449615
- config_name: allFCHINESE
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 315419763
num_examples: 2046
- name: test
num_bytes: 309929082
num_examples: 2079
download_size: 0
dataset_size: 625348845
- config_name: allFINDIAN
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 315419763
num_examples: 2046
- name: test
num_bytes: 309929082
num_examples: 2079
download_size: 0
dataset_size: 625348845
- config_name: allFMALAY
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 315419763
num_examples: 2046
- name: test
num_bytes: 309929082
num_examples: 2079
download_size: 0
dataset_size: 625348845
- config_name: allFOTHERS
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 315419763
num_examples: 2046
- name: test
num_bytes: 309929082
num_examples: 2079
download_size: 0
dataset_size: 625348845
- config_name: allFall
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 315419763
num_examples: 2046
- name: test
num_bytes: 309929082
num_examples: 2079
download_size: 0
dataset_size: 625348845
- config_name: allMCHINESE
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 315419763
num_examples: 2046
- name: test
num_bytes: 309929082
num_examples: 2079
download_size: 0
dataset_size: 625348845
- config_name: allMINDIAN
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 315419763
num_examples: 2046
- name: test
num_bytes: 309929082
num_examples: 2079
download_size: 0
dataset_size: 625348845
- config_name: allMMALAY
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 315419763
num_examples: 2046
- name: test
num_bytes: 309929082
num_examples: 2079
download_size: 0
dataset_size: 625348845
- config_name: allMOTHERS
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 315419763
num_examples: 2046
- name: test
num_bytes: 309929082
num_examples: 2079
download_size: 0
dataset_size: 625348845
- config_name: allMall
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 315419763
num_examples: 2046
- name: test
num_bytes: 309929082
num_examples: 2079
download_size: 0
dataset_size: 625348845
- config_name: allallCHINESE
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 315419763
num_examples: 2046
- name: test
num_bytes: 309929082
num_examples: 2079
download_size: 0
dataset_size: 625348845
- config_name: allallINDIAN
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 315419763
num_examples: 2046
- name: test
num_bytes: 309929082
num_examples: 2079
download_size: 0
dataset_size: 625348845
- config_name: allallMALAY
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 315419763
num_examples: 2046
- name: test
num_bytes: 309929082
num_examples: 2079
download_size: 0
dataset_size: 625348845
- config_name: allallOTHERS
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 315419763
num_examples: 2046
- name: test
num_bytes: 309929082
num_examples: 2079
download_size: 0
dataset_size: 625348845
- config_name: allallall
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcript
dtype: string
- name: mic
dtype: string
- name: audio_name
dtype: string
splits:
- name: train
num_bytes: 315419763
num_examples: 2046
- name: test
num_bytes: 309929082
num_examples: 2079
download_size: 0
dataset_size: 625348845
---
|
CyberHarem/imai_lisa_bangdream | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of imai_lisa/今井リサ (BanG Dream!)
This is the dataset of imai_lisa/今井リサ (BanG Dream!), containing 500 images and their tags.
The core tags of this character are `brown_hair, long_hair, bangs, green_eyes, earrings, breasts, ponytail, sidelocks, half_updo`; these are pruned from the tag lists in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 685.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/imai_lisa_bangdream/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 384.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/imai_lisa_bangdream/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1184 | 825.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/imai_lisa_bangdream/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 600.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/imai_lisa_bangdream/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1184 | 1.16 GiB | [Download](https://huggingface.co/datasets/CyberHarem/imai_lisa_bangdream/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
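Each IMG+TXT package above pairs every image with a same-named `.txt` file containing its comma-separated tags. As an illustration (this helper is not part of the dataset tooling — it only assumes the image/`.txt` naming convention described in the table), extracted pairs can be collected like this:

```python
import os

def collect_pairs(dataset_dir):
    """Collect (image_path, tags) pairs from an extracted IMG+TXT package.

    Assumes each image file has a sibling .txt file with the same stem
    holding its comma-separated tag list.
    """
    image_exts = {'.png', '.jpg', '.jpeg', '.webp'}
    pairs = []
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in image_exts:
            continue
        txt_path = os.path.join(dataset_dir, stem + '.txt')
        if os.path.exists(txt_path):
            with open(txt_path, encoding='utf-8') as f:
                tags = f.read().strip()
            pairs.append((os.path.join(dataset_dir, name), tags))
    return pairs
```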
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/imai_lisa_bangdream',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 18 |  |  |  |  |  | blush, 1girl, solo_focus, hetero, nipples, open_mouth, sex, vaginal, 1boy, pussy, large_breasts, penis, sweat, jewelry, navel, spread_legs, completely_nude, looking_at_viewer, mosaic_censoring, smile, collarbone, medium_breasts, on_back |
| 1 | 27 |  |  |  |  |  | 1girl, off-shoulder_sweater, smile, solo, bare_shoulders, long_sleeves, looking_at_viewer, sweater_dress, blush, collarbone, necklace, ribbed_sweater, black_belt, simple_background, white_background, medium_breasts, wavy_hair, hair_between_eyes, open_mouth, pendant, sleeves_past_wrists, black_thighhighs, cleavage, sitting, :3 |
| 2 | 11 |  |  |  |  |  | 1girl, beret, long_sleeves, solo, white_shirt, blush, jewelry, looking_at_viewer, smile, red_headwear, simple_background, upper_body, collarbone, open_mouth, plaid_skirt, shoulder_bag, wavy_hair, closed_mouth, grey_skirt, one_eye_closed, white_background |
| 3 | 10 |  |  |  |  |  | 1girl, collared_shirt, grey_jacket, jewelry, long_sleeves, looking_at_viewer, school_uniform, solo, striped_necktie, white_shirt, blazer, smile, plaid_skirt, pleated_skirt, simple_background, blush, white_background, brown_necktie, cowboy_shot, miniskirt, brown_skirt, closed_mouth |
| 4 | 7 |  |  |  |  |  | 1girl, blush, collared_shirt, jewelry, school_uniform, solo, sweater_vest, white_shirt, open_mouth, short_sleeves, upper_body, :d, looking_at_viewer, simple_background, striped_necktie, white_background, blue_necktie, hair_between_eyes |
| 5 | 12 |  |  |  |  |  | 1girl, looking_at_viewer, solo, blush, collarbone, day, outdoors, smile, bare_shoulders, blue_sky, cleavage, navel, standing, closed_mouth, cloud, cowboy_shot, ocean, large_breasts, medium_breasts, frilled_bikini, hair_between_eyes, stomach, wavy_hair, blurry_background, bracelet, groin, halterneck, multi-strapped_bikini, side-tie_bikini_bottom, water |
| 6 | 15 |  |  |  |  |  | 1girl, hair_flower, solo, looking_at_viewer, smile, blush, red_rose, frills, necklace, bare_shoulders, gloves, veil, black_dress |
| 7 | 8 |  |  |  |  |  | 1girl, black_feathers, feather_hair_ornament, hair_flower, looking_at_viewer, smile, solo, black_choker, detached_sleeves, dress, lace_choker, brooch, long_sleeves, upper_body, blush, red_bowtie, black_rose, blue_rose, electric_guitar, frills, holding, lace-trimmed_sleeves, neck_ribbon, red_ribbon, simple_background, white_background |
| 8 | 5 |  |  |  |  |  | feather_hair_ornament, hair_flower, hairband, looking_at_viewer, purple_rose, red_rose, smile, blue_feathers, cross-laced_clothes, crown, necklace, one_eye_closed, solo, 1girl, ;d, black_choker, blue_jacket, blue_rose, cleavage, long_sleeves, open_mouth, simple_background, upper_body, white_background, black_feathers, black_ribbon, blush, corset, cropped_jacket, dress, holding, multiple_girls, round_teeth |
| 9 | 6 |  |  |  |  |  | cleavage, collarbone, crop_top, hair_bow, looking_at_viewer, midriff, denim_shorts, hair_flower, heart, medium_breasts, navel, necklace, short_shorts, smile, 1girl, bare_shoulders, belt, black_bow, black_gloves, black_jacket, blush, choker, hoop_earrings, one_side_up, solo, stomach, cowboy_shot, hand_up, large_breasts, off_shoulder, open_jacket, spaghetti_strap, thigh_strap, thighhighs, wavy_hair |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | blush | 1girl | solo_focus | hetero | nipples | open_mouth | sex | vaginal | 1boy | pussy | large_breasts | penis | sweat | jewelry | navel | spread_legs | completely_nude | looking_at_viewer | mosaic_censoring | smile | collarbone | medium_breasts | on_back | off-shoulder_sweater | solo | bare_shoulders | long_sleeves | sweater_dress | necklace | ribbed_sweater | black_belt | simple_background | white_background | wavy_hair | hair_between_eyes | pendant | sleeves_past_wrists | black_thighhighs | cleavage | sitting | :3 | beret | white_shirt | red_headwear | upper_body | plaid_skirt | shoulder_bag | closed_mouth | grey_skirt | one_eye_closed | collared_shirt | grey_jacket | school_uniform | striped_necktie | blazer | pleated_skirt | brown_necktie | cowboy_shot | miniskirt | brown_skirt | sweater_vest | short_sleeves | :d | blue_necktie | day | outdoors | blue_sky | standing | cloud | ocean | frilled_bikini | stomach | blurry_background | bracelet | groin | halterneck | multi-strapped_bikini | side-tie_bikini_bottom | water | hair_flower | red_rose | frills | gloves | veil | black_dress | black_feathers | feather_hair_ornament | black_choker | detached_sleeves | dress | lace_choker | brooch | red_bowtie | black_rose | blue_rose | electric_guitar | holding | lace-trimmed_sleeves | neck_ribbon | red_ribbon | hairband | purple_rose | blue_feathers | cross-laced_clothes | crown | ;d | blue_jacket | black_ribbon | corset | cropped_jacket | multiple_girls | round_teeth | crop_top | hair_bow | midriff | denim_shorts | heart | short_shorts | belt | black_bow | black_gloves | black_jacket | choker | hoop_earrings | one_side_up | hand_up | off_shoulder | open_jacket | spaghetti_strap | thigh_strap | thighhighs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------------|:---------|:----------|:-------------|:------|:----------|:-------|:--------|:----------------|:--------|:--------|:----------|:--------|:--------------|:------------------|:--------------------|:-------------------|:--------|:-------------|:-----------------|:----------|:-----------------------|:-------|:-----------------|:---------------|:----------------|:-----------|:-----------------|:-------------|:--------------------|:-------------------|:------------|:--------------------|:----------|:----------------------|:-------------------|:-----------|:----------|:-----|:--------|:--------------|:---------------|:-------------|:--------------|:---------------|:---------------|:-------------|:-----------------|:-----------------|:--------------|:-----------------|:------------------|:---------|:----------------|:----------------|:--------------|:------------|:--------------|:---------------|:----------------|:-----|:---------------|:------|:-----------|:-----------|:-----------|:--------|:--------|:-----------------|:----------|:--------------------|:-----------|:--------|:-------------|:------------------------|:-------------------------|:--------|:--------------|:-----------|:---------|:---------|:-------|:--------------|:-----------------|:------------------------|:---------------|:-------------------|:--------|:--------------|:---------|:-------------|:-------------|:------------|:------------------|:----------|:-----------------------|:--------------|:-------------|:-----------|:--------------|:----------------|:----------------------|:--------|:-----|:--------------|:---------------|:---------|:-----------------|:-----------------|:--------------|:-----------|:-----------|:----------|:---------------|:--------|:---------------|:-------|:------------|:---------------|:---------------|:---------|:----------------|:--------------|:----------|:---------------|:--------------|:------------------|:--------------|:-------------|
| 0 | 18 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 27 |  |  |  |  |  | X | X | | | | X | | | | | | | | | | | | X | | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 11 |  |  |  |  |  | X | X | | | | X | | | | | | | | X | | | | X | | X | X | | | | X | | X | | | | | X | X | X | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 10 |  |  |  |  |  | X | X | | | | | | | | | | | | X | | | | X | | X | | | | | X | | X | | | | | X | X | | | | | | | | | | X | | | X | | X | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | X | | | | X | | | | | | | | X | | | | X | | | | | | | X | | | | | | | X | X | | X | | | | | | | | X | | X | | | | | | X | | X | X | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 12 |  |  |  |  |  | X | X | | | | | | | | | X | | | | X | | | X | | X | X | X | | | X | X | | | | | | | | X | X | | | | X | | | | | | | | | X | | | | | | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 15 |  |  |  |  |  | X | X | | | | | | | | | | | | | | | | X | | X | | | | | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 8 |  |  |  |  |  | X | X | | | | | | | | | | | | | | | | X | | X | | | | | X | | X | | | | | X | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 5 |  |  |  |  |  | X | X | | | | X | | | | | | | | | | | | X | | X | | | | | X | | X | | X | | | X | X | | | | | | X | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | X | X | X | | X | | | | | X | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 9 | 6 |  |  |  |  |  | X | X | | | | | | | | | X | | | | X | | | X | | X | X | X | | | X | X | | | X | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
GGWON/jnstyle | ---
license: afl-3.0
---
|
multilingual_librispeech | ---
pretty_name: MultiLingual LibriSpeech
annotations_creators:
- expert-generated
language_creators:
- crowdsourced
- expert-generated
language:
- de
- es
- fr
- it
- nl
- pl
- pt
license:
- cc-by-4.0
multilinguality:
- multilingual
paperswithcode_id: librispeech-1
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- automatic-speech-recognition
- audio-classification
task_ids:
- speaker-identification
dataset_info:
- config_name: polish
features:
- name: file
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: text
dtype: string
- name: speaker_id
dtype: int64
- name: chapter_id
dtype: int64
- name: id
dtype: string
splits:
- name: train
num_bytes: 16136430
num_examples: 25043
- name: train.9h
num_bytes: 1383232
num_examples: 2173
- name: train.1h
num_bytes: 145411
num_examples: 238
- name: validation
num_bytes: 318964
num_examples: 512
- name: test
num_bytes: 332317
num_examples: 520
download_size: 6609569551
dataset_size: 18316354
- config_name: german
features:
- name: file
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: text
dtype: string
- name: speaker_id
dtype: int64
- name: chapter_id
dtype: int64
- name: id
dtype: string
splits:
- name: train
num_bytes: 277089334
num_examples: 469942
- name: train.9h
num_bytes: 1325460
num_examples: 2194
- name: train.1h
num_bytes: 145998
num_examples: 241
- name: validation
num_bytes: 2160779
num_examples: 3469
- name: test
num_bytes: 2131177
num_examples: 3394
download_size: 122944886305
dataset_size: 282852748
- config_name: dutch
features:
- name: file
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: text
dtype: string
- name: speaker_id
dtype: int64
- name: chapter_id
dtype: int64
- name: id
dtype: string
splits:
- name: train
num_bytes: 218648573
num_examples: 374287
- name: train.9h
num_bytes: 1281951
num_examples: 2153
- name: train.1h
num_bytes: 141672
num_examples: 234
- name: validation
num_bytes: 1984165
num_examples: 3095
- name: test
num_bytes: 1945428
num_examples: 3075
download_size: 92158429530
dataset_size: 224001789
- config_name: french
features:
- name: file
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: text
dtype: string
- name: speaker_id
dtype: int64
- name: chapter_id
dtype: int64
- name: id
dtype: string
splits:
- name: train
num_bytes: 162009691
num_examples: 258213
- name: train.9h
num_bytes: 1347707
num_examples: 2167
- name: train.1h
num_bytes: 146699
num_examples: 241
- name: validation
num_bytes: 1482961
num_examples: 2416
- name: test
num_bytes: 1539152
num_examples: 2426
download_size: 64474642518
dataset_size: 166526210
- config_name: spanish
features:
- name: file
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: text
dtype: string
- name: speaker_id
dtype: int64
- name: chapter_id
dtype: int64
- name: id
dtype: string
splits:
- name: train
num_bytes: 136743162
num_examples: 220701
- name: train.9h
num_bytes: 1288180
num_examples: 2110
- name: train.1h
num_bytes: 138734
num_examples: 233
- name: validation
num_bytes: 1463115
num_examples: 2408
- name: test
num_bytes: 1464565
num_examples: 2385
download_size: 53296894035
dataset_size: 141097756
- config_name: italian
features:
- name: file
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: text
dtype: string
- name: speaker_id
dtype: int64
- name: chapter_id
dtype: int64
- name: id
dtype: string
splits:
- name: train
num_bytes: 36008104
num_examples: 59623
- name: train.9h
num_bytes: 1325927
num_examples: 2173
- name: train.1h
num_bytes: 145006
num_examples: 240
- name: validation
num_bytes: 732210
num_examples: 1248
- name: test
num_bytes: 746977
num_examples: 1262
download_size: 15395281399
dataset_size: 38958224
- config_name: portuguese
features:
- name: file
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: text
dtype: string
- name: speaker_id
dtype: int64
- name: chapter_id
dtype: int64
- name: id
dtype: string
splits:
- name: train
num_bytes: 23036487
num_examples: 37533
- name: train.9h
num_bytes: 1305698
num_examples: 2116
- name: train.1h
num_bytes: 143781
num_examples: 236
- name: validation
num_bytes: 512463
num_examples: 826
- name: test
num_bytes: 549893
num_examples: 871
download_size: 9982803818
dataset_size: 25548322
---
# Dataset Card for MultiLingual LibriSpeech
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [MultiLingual LibriSpeech ASR corpus](http://www.openslr.org/94)
- **Repository:** [Needs More Information]
- **Paper:** [MLS: A Large-Scale Multilingual Dataset for Speech Research](https://arxiv.org/abs/2012.03411)
- **Leaderboard:** [Paperswithcode Leaderboard](https://paperswithcode.com/dataset/multilingual-librispeech)
### Dataset Summary
<div class="course-tip course-tip-orange bg-gradient-to-br dark:bg-gradient-to-r before:border-orange-500 dark:before:border-orange-800 from-orange-50 dark:from-gray-900 to-white dark:to-gray-950 border border-orange-50 text-orange-700 dark:text-gray-400">
<p><b>Deprecated:</b> This legacy dataset doesn't support streaming and is not updated. Use "facebook/multilingual_librispeech" instead.</p>
</div>
The Multilingual LibriSpeech (MLS) dataset is a large multilingual corpus suitable for speech research. The dataset is derived from read audiobooks from LibriVox and consists of 8 languages: English, German, Dutch, Spanish, French, Italian, Portuguese, and Polish.
### Supported Tasks and Leaderboards
- `automatic-speech-recognition`, `audio-speaker-identification`: The dataset can be used to train a model for Automatic Speech Recognition (ASR). The model is presented with an audio file and asked to transcribe the audio file to written text. The most common evaluation metric is the word error rate (WER). The task has an active leaderboard which can be found at https://paperswithcode.com/dataset/multilingual-librispeech and ranks models based on their WER.
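WER, the metric mentioned above, is simply the word-level edit distance between the reference transcript and the hypothesis, divided by the reference length. A minimal self-contained sketch (illustrative only, not tied to any particular evaluation library):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = minimum edits to turn ref[:i] into hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = dp[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            dp[i][j] = min(sub, dp[i - 1][j] + 1, dp[i][j - 1] + 1)
    return dp[len(ref)][len(hyp)] / len(ref)

print(wer("a man said to the universe", "a man says to the universe"))  # 1 substitution / 6 words
```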
### Languages
The dataset is derived from read audiobooks from LibriVox and consists of 8 languages: English, German, Dutch, Spanish, French, Italian, Portuguese, and Polish.
## Dataset Structure
### Data Instances
A typical data point comprises the path to the audio file, usually called `file` and its transcription, called `text`. Some additional information about the speaker and the passage which contains the transcription is provided.
```python
{'chapter_id': 141231,
'file': '/home/patrick/.cache/huggingface/datasets/downloads/extracted/b7ded9969e09942ab65313e691e6fc2e12066192ee8527e21d634aca128afbe2/dev_clean/1272/141231/1272-141231-0000.flac',
'audio': {'path': '/home/patrick/.cache/huggingface/datasets/downloads/extracted/b7ded9969e09942ab65313e691e6fc2e12066192ee8527e21d634aca128afbe2/dev_clean/1272/141231/1272-141231-0000.flac',
'array': array([-0.00048828, -0.00018311, -0.00137329, ..., 0.00079346,
0.00091553, 0.00085449], dtype=float32),
'sampling_rate': 16000},
'id': '1272-141231-0000',
'speaker_id': 1272,
'text': 'A MAN SAID TO THE UNIVERSE SIR I EXIST'}
```
### Data Fields
- file: A path to the downloaded audio file in .flac format.
- audio: A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column: `dataset[0]["audio"]` the audio file is automatically decoded and resampled to `dataset.features["audio"].sampling_rate`. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the `"audio"` column, *i.e.* `dataset[0]["audio"]` should **always** be preferred over `dataset["audio"][0]`.
- text: the transcription of the audio file.
- id: unique id of the data sample.
- speaker_id: unique id of the speaker. The same speaker id can be found for multiple data samples.
- chapter_id: id of the audiobook chapter which includes the transcription.
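The access-order advice for the `audio` field can be illustrated with a toy lazy column (hypothetical names, not the actual `datasets` internals): row-first access decodes a single file, while column-first access decodes every file before indexing.

```python
class LazyAudioColumn:
    """Toy stand-in for an audio column that decodes files on access."""
    def __init__(self, paths):
        self.paths = paths
        self.decode_calls = 0

    def decode(self, path):
        # Pretend to decode and resample one audio file.
        self.decode_calls += 1
        return {"path": path, "array": [0.0], "sampling_rate": 16000}

    def row(self, i):
        # like dataset[i]["audio"]: decode only the requested file
        return self.decode(self.paths[i])

    def column(self):
        # like dataset["audio"]: decode *all* files before you can index
        return [self.decode(p) for p in self.paths]

col = LazyAudioColumn([f"clip_{i}.flac" for i in range(1000)])
one = col.row(0)            # decodes a single file
assert col.decode_calls == 1
all_audio = col.column()    # decodes all 1000 files just to reach one
assert col.decode_calls == 1001
```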
### Data Splits
| | Train | Train.9h | Train.1h | Dev | Test |
| ----- | ------ | ----- | ---- | ---- | ---- |
| german | 469942 | 2194 | 241 | 3469 | 3394 |
| dutch | 374287 | 2153 | 234 | 3095 | 3075 |
| french | 258213 | 2167 | 241 | 2416 | 2426 |
| spanish | 220701 | 2110 | 233 | 2408 | 2385 |
| italian | 59623 | 2173 | 240 | 1248 | 1262 |
| portuguese | 37533 | 2116 | 236 | 826 | 871 |
| polish | 25043 | 2173 | 238 | 512 | 520 |
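As a quick sanity check, the per-language full train counts in the table above can be tallied (numbers copied verbatim from the table):

```python
# Full "train" split sizes per config, from the Data Splits table.
train_examples = {
    "german": 469_942, "dutch": 374_287, "french": 258_213,
    "spanish": 220_701, "italian": 59_623, "portuguese": 37_533,
    "polish": 25_043,
}
total = sum(train_examples.values())
print(total)  # 1445342 train examples across the 7 non-English configs
```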
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
The dataset consists of recordings from people who have donated their voices online. You agree not to attempt to determine the identity of speakers in this dataset.
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
Public Domain, Creative Commons Attribution 4.0 International Public License ([CC-BY-4.0](https://creativecommons.org/licenses/by/4.0/legalcode))
### Citation Information
```
@article{Pratap2020MLSAL,
title={MLS: A Large-Scale Multilingual Dataset for Speech Research},
author={Vineel Pratap and Qiantong Xu and Anuroop Sriram and Gabriel Synnaeve and Ronan Collobert},
journal={ArXiv},
year={2020},
volume={abs/2012.03411}
}
```
### Contributions
Thanks to [@patrickvonplaten](https://github.com/patrickvonplaten) for adding this dataset. |
Dampish/DDDC | ---
license: cc-by-nc-4.0
---
|
Ru3ll/TreeImageDataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': test
'1': train
'2': val
splits:
- name: train
num_bytes: 1882655120.0
num_examples: 922
download_size: 1882708375
dataset_size: 1882655120.0
---
# Dataset Card for "TreeImageDataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zuko2/conditional-translation-me-en-me | ---
license: mit
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
dataset_info:
features:
- name: src
dtype: string
- name: tgt
dtype: string
splits:
- name: train
num_bytes: 23020866
num_examples: 112170
- name: valid
num_bytes: 105436
num_examples: 1000
- name: test
num_bytes: 52976
num_examples: 500
download_size: 12300184
dataset_size: 23179278
---
|
andersonbcdefg/sft_language_submix | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 3360236870.066389
num_examples: 2339239
download_size: 1943869500
dataset_size: 3360236870.066389
---
# Dataset Card for "sft_language_submix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Multimodal-Fatima/Caltech101_not_background_test_facebook_opt_2.7b_Visclues_ns_5647_random | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
- name: scores
sequence: float64
splits:
- name: fewshot_1_bs_16
num_bytes: 86816245.125
num_examples: 5647
- name: fewshot_3_bs_16
num_bytes: 90734679.125
num_examples: 5647
download_size: 169650032
dataset_size: 177550924.25
---
# Dataset Card for "Caltech101_not_background_test_facebook_opt_2.7b_Visclues_ns_5647_random"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SerahAKojenu/Masakhane-news | ---
task_categories:
- text-classification
language:
- en
- yo
tags:
- biology
- finance
size_categories:
- n<1K
---
# Dataset Card for [Dataset Name]
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@github-username](https://github.com/<github-username>) for adding this dataset. |
CyberHarem/pa_15_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of pa_15/PA-15/PA-15 (Girls' Frontline)
This is the dataset of pa_15/PA-15/PA-15 (Girls' Frontline), containing 265 images and their tags.
The core tags of this character are `blue_eyes, breasts, twintails, symbol-shaped_pupils, long_hair, blue_hair, heart-shaped_pupils, bangs, small_breasts, hair_between_eyes, hair_ornament, medium_breasts`, which are pruned in this dataset.
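Pruning works by dropping these always-present character tags from each image's tag list, so captions keep only per-image information. A minimal sketch of the idea (illustrative only, not the actual DeepGHS pipeline):

```python
# Core tags listed in the card; present on (nearly) every image of this character.
CORE_TAGS = {
    "blue_eyes", "breasts", "twintails", "symbol-shaped_pupils", "long_hair",
    "blue_hair", "heart-shaped_pupils", "bangs", "small_breasts",
    "hair_between_eyes", "hair_ornament", "medium_breasts",
}

def prune_core_tags(tags):
    """Remove character-defining tags, keeping only image-specific ones."""
    return [t for t in tags if t not in CORE_TAGS]

print(prune_core_tags(["1girl", "blue_eyes", "smile", "long_hair"]))
# -> ['1girl', 'smile']
```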
Images were crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 265 | 437.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pa_15_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 265 | 204.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pa_15_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 708 | 490.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pa_15_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 265 | 364.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pa_15_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 708 | 759.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pa_15_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/pa_15_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 36 |  |  |  |  |  | 1girl, looking_at_viewer, smile, solo, heart, nurse_cap, black_gloves, dress, white_thighhighs, open_mouth, holding_syringe, id_card, blush, grey_hair, intravenous_drip, white_background, pill, simple_background, black_panties, messy_hair |
| 1 | 16 |  |  |  |  |  | 1girl, looking_at_viewer, solo, white_background, black_leotard, heart, bare_shoulders, smile, simple_background, black_gloves, blue_thighhighs, highleg_leotard, covered_navel, single_thighhigh, blush, grey_hair, covered_nipples, jacket |
| 2 | 26 |  |  |  |  |  | 1girl, china_dress, looking_at_viewer, solo, twin_braids, blue_thighhighs, official_alternate_costume, black_gloves, pelvic_curtain, no_panties, heart, bare_shoulders, blush, half_gloves, smile, thighs, covered_navel, white_background, blue_dress, simple_background, open_mouth, choker, sitting |
| 3 | 19 |  |  |  |  |  | 1girl, official_alternate_costume, bare_shoulders, solo, hair_ribbon, looking_at_viewer, off_shoulder, blush, collarbone, long_sleeves, smile, black_choker, thigh_strap, white_background, blue_ribbon, barefoot, glasses, heart_print, two_side_up, white_shirt, blue-framed_eyewear, bottomless, simple_background, blue_nails, holding, naked_shirt, no_panties |
| 4 | 12 |  |  |  |  |  | 1girl, fox_ears, official_alternate_costume, school_uniform, solo, white_shirt, black_skirt, collared_shirt, blush, fox_tail, heart, long_sleeves, looking_at_viewer, simple_background, white_thighhighs, animal_ear_fluff, fox_girl, hairclip, pleated_skirt, black_choker, plaid_skirt, white_background, blue_bowtie, smile, open_mouth, sweater_vest, thighs, fang, miniskirt, sitting |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | smile | solo | heart | nurse_cap | black_gloves | dress | white_thighhighs | open_mouth | holding_syringe | id_card | blush | grey_hair | intravenous_drip | white_background | pill | simple_background | black_panties | messy_hair | black_leotard | bare_shoulders | blue_thighhighs | highleg_leotard | covered_navel | single_thighhigh | covered_nipples | jacket | china_dress | twin_braids | official_alternate_costume | pelvic_curtain | no_panties | half_gloves | thighs | blue_dress | choker | sitting | hair_ribbon | off_shoulder | collarbone | long_sleeves | black_choker | thigh_strap | blue_ribbon | barefoot | glasses | heart_print | two_side_up | white_shirt | blue-framed_eyewear | bottomless | blue_nails | holding | naked_shirt | fox_ears | school_uniform | black_skirt | collared_shirt | fox_tail | animal_ear_fluff | fox_girl | hairclip | pleated_skirt | plaid_skirt | blue_bowtie | sweater_vest | fang | miniskirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------|:-------|:--------|:------------|:---------------|:--------|:-------------------|:-------------|:------------------|:----------|:--------|:------------|:-------------------|:-------------------|:-------|:--------------------|:----------------|:-------------|:----------------|:-----------------|:------------------|:------------------|:----------------|:-------------------|:------------------|:---------|:--------------|:--------------|:-----------------------------|:-----------------|:-------------|:--------------|:---------|:-------------|:---------|:----------|:--------------|:---------------|:-------------|:---------------|:---------------|:--------------|:--------------|:-----------|:----------|:--------------|:--------------|:--------------|:----------------------|:-------------|:-------------|:----------|:--------------|:-----------|:-----------------|:--------------|:-----------------|:-----------|:-------------------|:-----------|:-----------|:----------------|:--------------|:--------------|:---------------|:-------|:------------|
| 0 | 36 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 16 |  |  |  |  |  | X | X | X | X | X | | X | | | | | | X | X | | X | | X | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 26 |  |  |  |  |  | X | X | X | X | X | | X | | | X | | | X | | | X | | X | | | | X | X | | X | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 19 |  |  |  |  |  | X | X | X | X | | | | | | | | | X | | | X | | X | | | | X | | | | | | | | | X | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 4 | 12 |  |  |  |  |  | X | X | X | X | X | | | | X | X | | | X | | | X | | X | | | | | | | | | | | | | X | | | | X | | | X | | | | X | X | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Nexdata/Mandarin_Mobile_Telephony_Conversational_Speech_Collection_Data | ---
---
# Dataset Card for Nexdata/Mandarin_Mobile_Telephony_Conversational_Speech_Collection_Data
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/1055?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
4491 speakers participated in the recording, communicating face-to-face in a natural way. No topics were specified, so the conversations cover a wide range of fields; the speech is natural and fluent, in line with real dialogue scenes. Transcripts were produced manually, with high accuracy.
For more details, please refer to the link: https://www.nexdata.ai/datasets/1055?source=Huggingface
### Supported Tasks and Leaderboards
- `automatic-speech-recognition`, `audio-speaker-identification`: The dataset can be used to train a model for Automatic Speech Recognition (ASR).
### Languages
Mandarin Chinese
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commerical License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions
|
FanChen0116/bus_few4_64x | ---
dataset_info:
features:
- name: id
dtype: int64
- name: tokens
sequence: string
- name: labels
sequence:
class_label:
names:
'0': O
'1': I-from_location
'2': B-from_location
'3': B-leaving_date
'4': I-leaving_date
'5': I-to_location
'6': B-to_location
- name: request_slot
sequence: string
splits:
- name: train
num_bytes: 871876
num_examples: 4480
- name: validation
num_bytes: 6900
num_examples: 35
- name: test
num_bytes: 70618
num_examples: 377
download_size: 131795
dataset_size: 949394
---
# Dataset Card for "bus_few4_64x"
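The integer labels in the metadata above follow a BIO tagging scheme; decoding a tagged token sequence back into slot values can be sketched as follows (the id-to-name mapping is copied from the `class_label` names; the helper and example tokens are illustrative):

```python
# Mapping copied from the class_label names in the dataset metadata.
ID2LABEL = {
    0: "O", 1: "I-from_location", 2: "B-from_location",
    3: "B-leaving_date", 4: "I-leaving_date",
    5: "I-to_location", 6: "B-to_location",
}

def decode_slots(tokens, label_ids):
    """Group B-/I- tagged tokens into (slot, text) spans."""
    spans, current = [], None
    for tok, lid in zip(tokens, label_ids):
        tag = ID2LABEL[lid]
        if tag.startswith("B-"):
            current = (tag[2:], [tok])   # start a new span
            spans.append(current)
        elif tag.startswith("I-") and current and current[0] == tag[2:]:
            current[1].append(tok)       # continue the current span
        else:
            current = None               # "O" or a stray I- tag
    return [(slot, " ".join(words)) for slot, words in spans]

print(decode_slots(
    ["bus", "from", "New", "York", "to", "Boston"],
    [0, 0, 2, 1, 0, 6],
))
# -> [('from_location', 'New York'), ('to_location', 'Boston')]
```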
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Yhyu13__oasst-rlhf-2-llama-30b-7k-steps-hf | ---
pretty_name: Evaluation run of Yhyu13/oasst-rlhf-2-llama-30b-7k-steps-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Yhyu13/oasst-rlhf-2-llama-30b-7k-steps-hf](https://huggingface.co/Yhyu13/oasst-rlhf-2-llama-30b-7k-steps-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Yhyu13__oasst-rlhf-2-llama-30b-7k-steps-hf\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-18T02:17:36.805434](https://huggingface.co/datasets/open-llm-leaderboard/details_Yhyu13__oasst-rlhf-2-llama-30b-7k-steps-hf/blob/main/results_2023-09-18T02-17-36.805434.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.002307046979865772,\n\
\ \"em_stderr\": 0.0004913221265094571,\n \"f1\": 0.07781564597315446,\n\
\ \"f1_stderr\": 0.0016061766920796063,\n \"acc\": 0.5511598739328604,\n\
\ \"acc_stderr\": 0.012142210957292902\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.002307046979865772,\n \"em_stderr\": 0.0004913221265094571,\n\
\ \"f1\": 0.07781564597315446,\n \"f1_stderr\": 0.0016061766920796063\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3146322971948446,\n \
\ \"acc_stderr\": 0.012791037227336032\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7876874506708761,\n \"acc_stderr\": 0.011493384687249773\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Yhyu13/oasst-rlhf-2-llama-30b-7k-steps-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|arc:challenge|25_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_18T02_17_36.805434
path:
- '**/details_harness|drop|3_2023-09-18T02-17-36.805434.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-18T02-17-36.805434.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_18T02_17_36.805434
path:
- '**/details_harness|gsm8k|5_2023-09-18T02-17-36.805434.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-18T02-17-36.805434.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hellaswag|10_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:42:38.656530.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T22:42:38.656530.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T22:42:38.656530.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_18T02_17_36.805434
path:
- '**/details_harness|winogrande|5_2023-09-18T02-17-36.805434.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-18T02-17-36.805434.parquet'
- config_name: results
data_files:
- split: 2023_07_19T22_42_38.656530
path:
- results_2023-07-19T22:42:38.656530.parquet
- split: 2023_09_18T02_17_36.805434
path:
- results_2023-09-18T02-17-36.805434.parquet
- split: latest
path:
- results_2023-09-18T02-17-36.805434.parquet
---
# Dataset Card for Evaluation run of Yhyu13/oasst-rlhf-2-llama-30b-7k-steps-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Yhyu13/oasst-rlhf-2-llama-30b-7k-steps-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Yhyu13/oasst-rlhf-2-llama-30b-7k-steps-hf](https://huggingface.co/Yhyu13/oasst-rlhf-2-llama-30b-7k-steps-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Yhyu13__oasst-rlhf-2-llama-30b-7k-steps-hf",
"harness_winogrande_5",
split="train")
```
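Timestamped split names follow the pattern shown in the configs above (e.g. `2023_09_18T02_17_36.805434`). As a minimal sketch, assuming that naming format, they can be parsed back into `datetime` objects to sort runs chronologically (the helper below is hypothetical, not part of the `datasets` API):

```python
from datetime import datetime

# Hypothetical helper: parse a timestamped split name such as
# "2023_09_18T02_17_36.805434" back into a datetime object.
def parse_split_timestamp(split_name: str) -> datetime:
    return datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")

# Sorting run splits chronologically lets you pick the most recent one,
# which is what the "latest" split aliases.
runs = ["2023_09_18T02_17_36.805434", "2023_07_19T22_42_38.656530"]
latest = max(runs, key=parse_split_timestamp)
```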
## Latest results
These are the [latest results from run 2023-09-18T02:17:36.805434](https://huggingface.co/datasets/open-llm-leaderboard/details_Yhyu13__oasst-rlhf-2-llama-30b-7k-steps-hf/blob/main/results_2023-09-18T02-17-36.805434.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.002307046979865772,
"em_stderr": 0.0004913221265094571,
"f1": 0.07781564597315446,
"f1_stderr": 0.0016061766920796063,
"acc": 0.5511598739328604,
"acc_stderr": 0.012142210957292902
},
"harness|drop|3": {
"em": 0.002307046979865772,
"em_stderr": 0.0004913221265094571,
"f1": 0.07781564597315446,
"f1_stderr": 0.0016061766920796063
},
"harness|gsm8k|5": {
"acc": 0.3146322971948446,
"acc_stderr": 0.012791037227336032
},
"harness|winogrande|5": {
"acc": 0.7876874506708761,
"acc_stderr": 0.011493384687249773
}
}
```
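For reference, the top-level `"all"` block aggregates the per-task metrics; for `acc` in this run it appears to be the simple mean over the accuracy-bearing tasks (gsm8k and winogrande). This is an observation from the numbers above, not documented behavior:

```python
# Per-task accuracies copied from the results JSON above.
gsm8k_acc = 0.3146322971948446
winogrande_acc = 0.7876874506708761

# The "all" block's acc appears to be their unweighted mean.
all_acc = (gsm8k_acc + winogrande_acc) / 2
```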
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_TheBloke__Vicuna-33B-1-3-SuperHOT-8K-fp16 | ---
pretty_name: Evaluation run of TheBloke/Vicuna-33B-1-3-SuperHOT-8K-fp16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/Vicuna-33B-1-3-SuperHOT-8K-fp16](https://huggingface.co/TheBloke/Vicuna-33B-1-3-SuperHOT-8K-fp16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Vicuna-33B-1-3-SuperHOT-8K-fp16\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-31T19:04:33.192118](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Vicuna-33B-1-3-SuperHOT-8K-fp16/blob/main/results_2023-07-31T19%3A04%3A33.192118.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2367148405069541,\n\
\ \"acc_stderr\": 0.030958077810881182,\n \"acc_norm\": 0.23838963087978138,\n\
\ \"acc_norm_stderr\": 0.030974710079953026,\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.01481619599193159,\n \"mc2\": 0.4693099566156165,\n\
\ \"mc2_stderr\": 0.01667201792733067\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.21331058020477817,\n \"acc_stderr\": 0.011970971742326334,\n\
\ \"acc_norm\": 0.25426621160409557,\n \"acc_norm_stderr\": 0.012724999945157744\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.28828918542123083,\n\
\ \"acc_stderr\": 0.00452040633108404,\n \"acc_norm\": 0.3461461860187214,\n\
\ \"acc_norm_stderr\": 0.004747682003491466\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.24444444444444444,\n\
\ \"acc_stderr\": 0.03712537833614865,\n \"acc_norm\": 0.24444444444444444,\n\
\ \"acc_norm_stderr\": 0.03712537833614865\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.34,\n\
\ \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \
\ \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.025288394502891373,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.025288394502891373\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.04096985139843671,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.04096985139843671\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2275132275132275,\n \"acc_stderr\": 0.02159126940782378,\n \"\
acc_norm\": 0.2275132275132275,\n \"acc_norm_stderr\": 0.02159126940782378\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n\
\ \"acc_stderr\": 0.0361960452412425,\n \"acc_norm\": 0.20634920634920634,\n\
\ \"acc_norm_stderr\": 0.0361960452412425\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.2838709677419355,\n \"acc_stderr\": 0.025649381063029254,\n \"\
acc_norm\": 0.2838709677419355,\n \"acc_norm_stderr\": 0.025649381063029254\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.24630541871921183,\n \"acc_stderr\": 0.030315099285617722,\n \"\
acc_norm\": 0.24630541871921183,\n \"acc_norm_stderr\": 0.030315099285617722\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.18181818181818182,\n \"acc_stderr\": 0.027479603010538797,\n \"\
acc_norm\": 0.18181818181818182,\n \"acc_norm_stderr\": 0.027479603010538797\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860702,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860702\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20512820512820512,\n \"acc_stderr\": 0.020473233173551982,\n\
\ \"acc_norm\": 0.20512820512820512,\n \"acc_norm_stderr\": 0.020473233173551982\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.23703703703703705,\n \"acc_stderr\": 0.02592887613276612,\n \
\ \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.02592887613276612\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.027553614467863818,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.027553614467863818\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.19205298013245034,\n \"acc_stderr\": 0.032162984205936135,\n \"\
acc_norm\": 0.19205298013245034,\n \"acc_norm_stderr\": 0.032162984205936135\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.21467889908256882,\n \"acc_stderr\": 0.01760430414925649,\n \"\
acc_norm\": 0.21467889908256882,\n \"acc_norm_stderr\": 0.01760430414925649\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2638888888888889,\n \"acc_stderr\": 0.03005820270430985,\n \"\
acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 0.03005820270430985\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.242152466367713,\n\
\ \"acc_stderr\": 0.028751392398694755,\n \"acc_norm\": 0.242152466367713,\n\
\ \"acc_norm_stderr\": 0.028751392398694755\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n\
\ \"acc_stderr\": 0.04284467968052192,\n \"acc_norm\": 0.26851851851851855,\n\
\ \"acc_norm_stderr\": 0.04284467968052192\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.03291099578615767,\n\
\ \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.03291099578615767\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841043,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841043\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.23931623931623933,\n\
\ \"acc_stderr\": 0.027951826808924333,\n \"acc_norm\": 0.23931623931623933,\n\
\ \"acc_norm_stderr\": 0.027951826808924333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2554278416347382,\n\
\ \"acc_stderr\": 0.015594955384455772,\n \"acc_norm\": 0.2554278416347382,\n\
\ \"acc_norm_stderr\": 0.015594955384455772\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.20520231213872833,\n \"acc_stderr\": 0.021742519835276287,\n\
\ \"acc_norm\": 0.20520231213872833,\n \"acc_norm_stderr\": 0.021742519835276287\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n\
\ \"acc_stderr\": 0.014265554192331144,\n \"acc_norm\": 0.23910614525139665,\n\
\ \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.02355083135199509,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.02355083135199509\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2191358024691358,\n \"acc_stderr\": 0.023016705640262203,\n\
\ \"acc_norm\": 0.2191358024691358,\n \"acc_norm_stderr\": 0.023016705640262203\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24822695035460993,\n \"acc_stderr\": 0.025770015644290392,\n \
\ \"acc_norm\": 0.24822695035460993,\n \"acc_norm_stderr\": 0.025770015644290392\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24771838331160365,\n\
\ \"acc_stderr\": 0.011025499291443738,\n \"acc_norm\": 0.24771838331160365,\n\
\ \"acc_norm_stderr\": 0.011025499291443738\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.21323529411764705,\n \"acc_stderr\": 0.024880971512294275,\n\
\ \"acc_norm\": 0.21323529411764705,\n \"acc_norm_stderr\": 0.024880971512294275\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2549019607843137,\n \"acc_stderr\": 0.017630827375148383,\n \
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.017630827375148383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2,\n\
\ \"acc_stderr\": 0.03831305140884601,\n \"acc_norm\": 0.2,\n \
\ \"acc_norm_stderr\": 0.03831305140884601\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.19183673469387755,\n \"acc_stderr\": 0.025206963154225378,\n\
\ \"acc_norm\": 0.19183673469387755,\n \"acc_norm_stderr\": 0.025206963154225378\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816508,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816508\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n\
\ \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n\
\ \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03188578017686399,\n\
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03188578017686399\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.01481619599193159,\n \"mc2\": 0.4693099566156165,\n\
\ \"mc2_stderr\": 0.01667201792733067\n }\n}\n```"
repo_url: https://huggingface.co/TheBloke/Vicuna-33B-1-3-SuperHOT-8K-fp16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|arc:challenge|25_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hellaswag|10_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T19:04:33.192118.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T19:04:33.192118.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-31T19:04:33.192118.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-31T19:04:33.192118.parquet'
- config_name: results
data_files:
- split: 2023_07_31T19_04_33.192118
path:
- results_2023-07-31T19:04:33.192118.parquet
- split: latest
path:
- results_2023-07-31T19:04:33.192118.parquet
---
# Dataset Card for Evaluation run of TheBloke/Vicuna-33B-1-3-SuperHOT-8K-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Vicuna-33B-1-3-SuperHOT-8K-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Vicuna-33B-1-3-SuperHOT-8K-fp16](https://huggingface.co/TheBloke/Vicuna-33B-1-3-SuperHOT-8K-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Vicuna-33B-1-3-SuperHOT-8K-fp16",
	"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-07-31T19:04:33.192118](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Vicuna-33B-1-3-SuperHOT-8K-fp16/blob/main/results_2023-07-31T19%3A04%3A33.192118.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2367148405069541,
"acc_stderr": 0.030958077810881182,
"acc_norm": 0.23838963087978138,
"acc_norm_stderr": 0.030974710079953026,
"mc1": 0.23378212974296206,
"mc1_stderr": 0.01481619599193159,
"mc2": 0.4693099566156165,
"mc2_stderr": 0.01667201792733067
},
"harness|arc:challenge|25": {
"acc": 0.21331058020477817,
"acc_stderr": 0.011970971742326334,
"acc_norm": 0.25426621160409557,
"acc_norm_stderr": 0.012724999945157744
},
"harness|hellaswag|10": {
"acc": 0.28828918542123083,
"acc_stderr": 0.00452040633108404,
"acc_norm": 0.3461461860187214,
"acc_norm_stderr": 0.004747682003491466
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.03712537833614865,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.03712537833614865
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.025288394502891373,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.025288394502891373
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.25,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.04096985139843671,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.04096985139843671
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2275132275132275,
"acc_stderr": 0.02159126940782378,
"acc_norm": 0.2275132275132275,
"acc_norm_stderr": 0.02159126940782378
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.0361960452412425,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.0361960452412425
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2838709677419355,
"acc_stderr": 0.025649381063029254,
"acc_norm": 0.2838709677419355,
"acc_norm_stderr": 0.025649381063029254
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.24630541871921183,
"acc_stderr": 0.030315099285617722,
"acc_norm": 0.24630541871921183,
"acc_norm_stderr": 0.030315099285617722
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.18181818181818182,
"acc_stderr": 0.027479603010538797,
"acc_norm": 0.18181818181818182,
"acc_norm_stderr": 0.027479603010538797
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860702,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860702
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20512820512820512,
"acc_stderr": 0.020473233173551982,
"acc_norm": 0.20512820512820512,
"acc_norm_stderr": 0.020473233173551982
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.02592887613276612,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.02592887613276612
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.027553614467863818,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.027553614467863818
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.19205298013245034,
"acc_stderr": 0.032162984205936135,
"acc_norm": 0.19205298013245034,
"acc_norm_stderr": 0.032162984205936135
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.21467889908256882,
"acc_stderr": 0.01760430414925649,
"acc_norm": 0.21467889908256882,
"acc_norm_stderr": 0.01760430414925649
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03005820270430985,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03005820270430985
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.242152466367713,
"acc_stderr": 0.028751392398694755,
"acc_norm": 0.242152466367713,
"acc_norm_stderr": 0.028751392398694755
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052192,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.03291099578615767,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.03291099578615767
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841043,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841043
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.23931623931623933,
"acc_stderr": 0.027951826808924333,
"acc_norm": 0.23931623931623933,
"acc_norm_stderr": 0.027951826808924333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2554278416347382,
"acc_stderr": 0.015594955384455772,
"acc_norm": 0.2554278416347382,
"acc_norm_stderr": 0.015594955384455772
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.20520231213872833,
"acc_stderr": 0.021742519835276287,
"acc_norm": 0.20520231213872833,
"acc_norm_stderr": 0.021742519835276287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331144,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.02355083135199509,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.02355083135199509
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2191358024691358,
"acc_stderr": 0.023016705640262203,
"acc_norm": 0.2191358024691358,
"acc_norm_stderr": 0.023016705640262203
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24822695035460993,
"acc_stderr": 0.025770015644290392,
"acc_norm": 0.24822695035460993,
"acc_norm_stderr": 0.025770015644290392
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24771838331160365,
"acc_stderr": 0.011025499291443738,
"acc_norm": 0.24771838331160365,
"acc_norm_stderr": 0.011025499291443738
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.21323529411764705,
"acc_stderr": 0.024880971512294275,
"acc_norm": 0.21323529411764705,
"acc_norm_stderr": 0.024880971512294275
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.017630827375148383,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.017630827375148383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2,
"acc_stderr": 0.03831305140884601,
"acc_norm": 0.2,
"acc_norm_stderr": 0.03831305140884601
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19183673469387755,
"acc_stderr": 0.025206963154225378,
"acc_norm": 0.19183673469387755,
"acc_norm_stderr": 0.025206963154225378
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03188578017686399,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03188578017686399
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23378212974296206,
"mc1_stderr": 0.01481619599193159,
"mc2": 0.4693099566156165,
"mc2_stderr": 0.01667201792733067
}
}
```
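For a quick sanity check against the per-task numbers above, the top-level `"all"` entry can be approximated by averaging the task-level accuracies. The sketch below uses a few values copied from the results; treating `"all"` as an unweighted mean over tasks is an assumption, not something documented in this card:

```python
from statistics import mean

# A few per-task accuracies copied verbatim from the results above.
# NOTE: an unweighted mean over tasks is assumed here; the leaderboard's
# exact aggregation rule is not stated in this card.
task_acc = {
    "harness|hendrycksTest-abstract_algebra|5": 0.22,
    "harness|hendrycksTest-business_ethics|5": 0.34,
    "harness|hendrycksTest-college_biology|5": 0.25,
}

overall = mean(task_acc.values())
print(round(overall, 4))  # → 0.27 for this three-task subset
```

With all 57 MMLU tasks plus ARC and HellaSwag included, this kind of average lands near the reported `"all"` accuracy of ~0.2367.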
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
vwxyzjn/ultrachat_200k_filtered_1707947544 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: query
list:
- name: content
dtype: string
- name: role
dtype: string
- name: query_token
sequence: int64
- name: query_reference_response
list:
- name: content
dtype: string
- name: role
dtype: string
- name: query_reference_response_token
sequence: int64
- name: query_reference_response_token_len
dtype: int64
- name: query_token_len
dtype: int64
- name: reference_response
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: reference_response_token
sequence: int64
- name: reference_response_token_len
dtype: int64
splits:
- name: test_sft
num_bytes: 1982888370.9168758
num_examples: 22991
- name: train_sft
num_bytes: 17846869528.524822
num_examples: 206698
download_size: 3299597538
dataset_size: 19829757899.441696
---
# Args
```python
{'base_model': 'mistralai/Mistral-7B-v0.1',
'check_length_correctness': True,
'debug': False,
'hf_entity': 'vwxyzjn',
'params': TaskQueryHParams(length=3000,
format_str='SUBREDDIT: r/{subreddit}\n'
'\n'
'TITLE: {title}\n'
'\n'
'POST: {post}\n'
'\n'
'TL;DR:',
truncate_field='post',
truncate_text='\n',
padding='pad_token',
pad_token=[32000],
pad_side='left',
max_query_length=3000,
max_sft_query_response_length=4000,
max_sft_response_length=1500,
max_rm_query_response_length=4500),
'push_to_hub': True}
```
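The `TaskQueryHParams` above left-pad each tokenized query with `pad_token` (`[32000]`) up to `max_query_length=3000`. A minimal sketch of that padding step (a hypothetical helper for illustration, not the actual preprocessing script):

```python
def pad_query(tokens, max_length=3000, pad_token=32000, pad_side="left"):
    """Pad a token-id list to a fixed length, mirroring the
    padding='pad_token', pad_side='left' settings in TaskQueryHParams."""
    if len(tokens) > max_length:
        raise ValueError(f"query length {len(tokens)} exceeds max_length={max_length}")
    padding = [pad_token] * (max_length - len(tokens))
    return padding + tokens if pad_side == "left" else tokens + padding


# Example: a 3-token query left-padded to length 5 for illustration.
print(pad_query([101, 102, 103], max_length=5))  # [32000, 32000, 101, 102, 103]
```

Left padding keeps the real tokens adjacent to the generated continuation, which is why `pad_side='left'` is the common choice for decoder-only models.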
|
CyberHarem/izumo_no_okuni_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of izumo_no_okuni/出雲阿国/出云阿国 (Fate/Grand Order)
This is the dataset of izumo_no_okuni/出雲阿国/出云阿国 (Fate/Grand Order), containing 74 images and their tags.
The core tags of this character are `multicolored_hair, brown_hair, two-tone_hair, split-color_hair, long_hair, yellow_eyes, hair_ornament, white_hair, ribbon, sidelocks, breasts, hair_ribbon, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 74 | 128.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/izumo_no_okuni_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 74 | 109.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/izumo_no_okuni_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 179 | 211.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/izumo_no_okuni_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/izumo_no_okuni_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
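If you use one of the plain `IMG+TXT` packages instead (e.g. `dataset-1200.zip`), each image ships with a same-named `.txt` tag file. A minimal pairing sketch after extraction; the comma-separated tag format inside each text file is an assumption:

```python
import os


def load_tag_pairs(dataset_dir):
    """Pair each image in an extracted IMG+TXT package with its
    same-named .txt tag file; tags are assumed to be comma-separated."""
    pairs = {}
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        path = os.path.join(dataset_dir, name)
        if ext.lower() in (".png", ".jpg", ".jpeg", ".webp"):
            pairs.setdefault(stem, {})["image"] = path
        elif ext.lower() == ".txt":
            with open(path, encoding="utf-8") as f:
                pairs.setdefault(stem, {})["tags"] = [
                    t.strip() for t in f.read().split(",") if t.strip()
                ]
    # keep only complete image/tag pairs
    return {k: v for k, v in pairs.items() if "image" in v and "tags" in v}
```

This gives you a plain dict of `{stem: {"image": path, "tags": [...]}}` entries ready for filtering by tag before training.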
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 22 |  |  |  |  |  | 1girl, kimono, looking_at_viewer, wide_sleeves, long_sleeves, solo, smile, gloves, thighhighs, obi, open_mouth, thighs, blush, hand_fan |
| 1 | 9 |  |  |  |  |  | 1girl, smile, white_kimono, looking_at_viewer, miko, solo, wide_sleeves, long_sleeves, blush, red_hakama, black_gloves, hakama_skirt, blunt_bangs, open_mouth |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | kimono | looking_at_viewer | wide_sleeves | long_sleeves | solo | smile | gloves | thighhighs | obi | open_mouth | thighs | blush | hand_fan | white_kimono | miko | red_hakama | black_gloves | hakama_skirt | blunt_bangs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:--------------------|:---------------|:---------------|:-------|:--------|:---------|:-------------|:------|:-------------|:---------|:--------|:-----------|:---------------|:-------|:-------------|:---------------|:---------------|:--------------|
| 0 | 22 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 1 | 9 |  |  |  |  |  | X | | X | X | X | X | X | | | | X | | X | | X | X | X | X | X | X |
|
open-llm-leaderboard/details_LLM360__Amber | ---
pretty_name: Evaluation run of LLM360/Amber
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [LLM360/Amber](https://huggingface.co/LLM360/Amber) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LLM360__Amber\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-19T04:59:05.791643](https://huggingface.co/datasets/open-llm-leaderboard/details_LLM360__Amber/blob/main/results_2023-12-19T04-59-05.791643.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2778470494306401,\n\
\ \"acc_stderr\": 0.03144370019620237,\n \"acc_norm\": 0.27870842542577673,\n\
\ \"acc_norm_stderr\": 0.032201431055323866,\n \"mc1\": 0.2141982864137087,\n\
\ \"mc1_stderr\": 0.014362148155690462,\n \"mc2\": 0.3355637385526089,\n\
\ \"mc2_stderr\": 0.013068282225164367\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.39761092150170646,\n \"acc_stderr\": 0.014301752223279536,\n\
\ \"acc_norm\": 0.40955631399317405,\n \"acc_norm_stderr\": 0.014370358632472437\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5478988249352719,\n\
\ \"acc_stderr\": 0.004966832553245046,\n \"acc_norm\": 0.7379008165704043,\n\
\ \"acc_norm_stderr\": 0.00438877529821019\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.035914440841969694,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.035914440841969694\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.03583496176361065,\n\
\ \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.03583496176361065\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.31,\n\
\ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21132075471698114,\n \"acc_stderr\": 0.025125766484827845,\n\
\ \"acc_norm\": 0.21132075471698114,\n \"acc_norm_stderr\": 0.025125766484827845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n\
\ \"acc_stderr\": 0.03716177437566018,\n \"acc_norm\": 0.2708333333333333,\n\
\ \"acc_norm_stderr\": 0.03716177437566018\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24855491329479767,\n\
\ \"acc_stderr\": 0.03295304696818318,\n \"acc_norm\": 0.24855491329479767,\n\
\ \"acc_norm_stderr\": 0.03295304696818318\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.03793281185307809,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.03793281185307809\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.251063829787234,\n \"acc_stderr\": 0.02834696377716246,\n\
\ \"acc_norm\": 0.251063829787234,\n \"acc_norm_stderr\": 0.02834696377716246\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.0414243971948936,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.0414243971948936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.03855289616378947,\n\
\ \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.03855289616378947\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2777777777777778,\n \"acc_stderr\": 0.02306818884826111,\n \"\
acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02306818884826111\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.040735243221471276,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.040735243221471276\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.23870967741935484,\n\
\ \"acc_stderr\": 0.02425107126220884,\n \"acc_norm\": 0.23870967741935484,\n\
\ \"acc_norm_stderr\": 0.02425107126220884\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.16748768472906403,\n \"acc_stderr\": 0.026273086047535414,\n\
\ \"acc_norm\": 0.16748768472906403,\n \"acc_norm_stderr\": 0.026273086047535414\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\"\
: 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.28484848484848485,\n \"acc_stderr\": 0.035243908445117836,\n\
\ \"acc_norm\": 0.28484848484848485,\n \"acc_norm_stderr\": 0.035243908445117836\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.1919191919191919,\n \"acc_stderr\": 0.028057791672989024,\n \"\
acc_norm\": 0.1919191919191919,\n \"acc_norm_stderr\": 0.028057791672989024\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.22564102564102564,\n \"acc_stderr\": 0.021193632525148533,\n\
\ \"acc_norm\": 0.22564102564102564,\n \"acc_norm_stderr\": 0.021193632525148533\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2074074074074074,\n \"acc_stderr\": 0.024720713193952165,\n \
\ \"acc_norm\": 0.2074074074074074,\n \"acc_norm_stderr\": 0.024720713193952165\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.24789915966386555,\n \"acc_stderr\": 0.028047967224176896,\n\
\ \"acc_norm\": 0.24789915966386555,\n \"acc_norm_stderr\": 0.028047967224176896\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.2018348623853211,\n \"acc_stderr\": 0.01720857935778755,\n \"\
acc_norm\": 0.2018348623853211,\n \"acc_norm_stderr\": 0.01720857935778755\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2777777777777778,\n \"acc_stderr\": 0.030546745264953202,\n \"\
acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.030546745264953202\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.3235294117647059,\n \"acc_stderr\": 0.03283472056108567,\n \"\
acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.03283472056108567\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.29957805907172996,\n \"acc_stderr\": 0.029818024749753095,\n \
\ \"acc_norm\": 0.29957805907172996,\n \"acc_norm_stderr\": 0.029818024749753095\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3004484304932735,\n\
\ \"acc_stderr\": 0.030769352008229143,\n \"acc_norm\": 0.3004484304932735,\n\
\ \"acc_norm_stderr\": 0.030769352008229143\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3511450381679389,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.3511450381679389,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.256198347107438,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\"\
: 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n \
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.19631901840490798,\n \"acc_stderr\": 0.031207970394709215,\n\
\ \"acc_norm\": 0.19631901840490798,\n \"acc_norm_stderr\": 0.031207970394709215\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.04327040932578728,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.04327040932578728\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.31196581196581197,\n\
\ \"acc_stderr\": 0.03035152732334496,\n \"acc_norm\": 0.31196581196581197,\n\
\ \"acc_norm_stderr\": 0.03035152732334496\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2796934865900383,\n\
\ \"acc_stderr\": 0.016050792148036522,\n \"acc_norm\": 0.2796934865900383,\n\
\ \"acc_norm_stderr\": 0.016050792148036522\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3265895953757225,\n \"acc_stderr\": 0.025248264774242832,\n\
\ \"acc_norm\": 0.3265895953757225,\n \"acc_norm_stderr\": 0.025248264774242832\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23687150837988827,\n\
\ \"acc_stderr\": 0.01421957078810399,\n \"acc_norm\": 0.23687150837988827,\n\
\ \"acc_norm_stderr\": 0.01421957078810399\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.27124183006535946,\n \"acc_stderr\": 0.02545775669666787,\n\
\ \"acc_norm\": 0.27124183006535946,\n \"acc_norm_stderr\": 0.02545775669666787\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3183279742765273,\n\
\ \"acc_stderr\": 0.02645722506781103,\n \"acc_norm\": 0.3183279742765273,\n\
\ \"acc_norm_stderr\": 0.02645722506781103\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25308641975308643,\n \"acc_stderr\": 0.024191808600713002,\n\
\ \"acc_norm\": 0.25308641975308643,\n \"acc_norm_stderr\": 0.024191808600713002\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.30141843971631205,\n \"acc_stderr\": 0.02737412888263115,\n \
\ \"acc_norm\": 0.30141843971631205,\n \"acc_norm_stderr\": 0.02737412888263115\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2790091264667536,\n\
\ \"acc_stderr\": 0.011455208832803545,\n \"acc_norm\": 0.2790091264667536,\n\
\ \"acc_norm_stderr\": 0.011455208832803545\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.1875,\n \"acc_stderr\": 0.023709788253811766,\n \
\ \"acc_norm\": 0.1875,\n \"acc_norm_stderr\": 0.023709788253811766\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.30718954248366015,\n \"acc_stderr\": 0.018663359671463663,\n \
\ \"acc_norm\": 0.30718954248366015,\n \"acc_norm_stderr\": 0.018663359671463663\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2818181818181818,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.2818181818181818,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.1836734693877551,\n \"acc_stderr\": 0.02478907133200763,\n\
\ \"acc_norm\": 0.1836734693877551,\n \"acc_norm_stderr\": 0.02478907133200763\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2885572139303483,\n\
\ \"acc_stderr\": 0.03203841040213322,\n \"acc_norm\": 0.2885572139303483,\n\
\ \"acc_norm_stderr\": 0.03203841040213322\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.27710843373493976,\n\
\ \"acc_stderr\": 0.034843315926805875,\n \"acc_norm\": 0.27710843373493976,\n\
\ \"acc_norm_stderr\": 0.034843315926805875\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3684210526315789,\n \"acc_stderr\": 0.036996580176568775,\n\
\ \"acc_norm\": 0.3684210526315789,\n \"acc_norm_stderr\": 0.036996580176568775\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2141982864137087,\n\
\ \"mc1_stderr\": 0.014362148155690462,\n \"mc2\": 0.3355637385526089,\n\
\ \"mc2_stderr\": 0.013068282225164367\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6787687450670876,\n \"acc_stderr\": 0.013123599324558307\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.028051554207733132,\n \
\ \"acc_stderr\": 0.004548229533836332\n }\n}\n```"
repo_url: https://huggingface.co/LLM360/Amber
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|arc:challenge|25_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|gsm8k|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hellaswag|10_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-19T04-59-05.791643.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-19T04-59-05.791643.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- '**/details_harness|winogrande|5_2023-12-19T04-59-05.791643.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-19T04-59-05.791643.parquet'
- config_name: results
data_files:
- split: 2023_12_19T04_59_05.791643
path:
- results_2023-12-19T04-59-05.791643.parquet
- split: latest
path:
- results_2023-12-19T04-59-05.791643.parquet
---
# Dataset Card for Evaluation run of LLM360/Amber
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [LLM360/Amber](https://huggingface.co/LLM360/Amber) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LLM360__Amber",
	"harness_winogrande_5",
	split="latest")
```
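Every per-task config in this card follows the same naming pattern, so the config names can also be built programmatically. A minimal sketch (the task sample below is illustrative; the full list appears in the YAML metadata above):

```python
# Build Open LLM Leaderboard config names for a few MMLU subtasks.
# The pattern "harness_hendrycksTest_<task>_5" matches the configs
# listed in this card's metadata; the task names here are a sample.
tasks = ["abstract_algebra", "anatomy", "world_religions"]
configs = [f"harness_hendrycksTest_{t}_5" for t in tasks]
print(configs[0])  # harness_hendrycksTest_abstract_algebra_5
```

Any of these names can be passed as the second argument to `load_dataset` in the snippet above.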
## Latest results
These are the [latest results from run 2023-12-19T04:59:05.791643](https://huggingface.co/datasets/open-llm-leaderboard/details_LLM360__Amber/blob/main/results_2023-12-19T04-59-05.791643.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" config and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.2778470494306401,
"acc_stderr": 0.03144370019620237,
"acc_norm": 0.27870842542577673,
"acc_norm_stderr": 0.032201431055323866,
"mc1": 0.2141982864137087,
"mc1_stderr": 0.014362148155690462,
"mc2": 0.3355637385526089,
"mc2_stderr": 0.013068282225164367
},
"harness|arc:challenge|25": {
"acc": 0.39761092150170646,
"acc_stderr": 0.014301752223279536,
"acc_norm": 0.40955631399317405,
"acc_norm_stderr": 0.014370358632472437
},
"harness|hellaswag|10": {
"acc": 0.5478988249352719,
"acc_stderr": 0.004966832553245046,
"acc_norm": 0.7379008165704043,
"acc_norm_stderr": 0.00438877529821019
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.035914440841969694,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.035914440841969694
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.03583496176361065,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.03583496176361065
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21132075471698114,
"acc_stderr": 0.025125766484827845,
"acc_norm": 0.21132075471698114,
"acc_norm_stderr": 0.025125766484827845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2708333333333333,
"acc_stderr": 0.03716177437566018,
"acc_norm": 0.2708333333333333,
"acc_norm_stderr": 0.03716177437566018
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.03295304696818318,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.03295304696818318
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.03793281185307809,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.03793281185307809
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.251063829787234,
"acc_stderr": 0.02834696377716246,
"acc_norm": 0.251063829787234,
"acc_norm_stderr": 0.02834696377716246
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.0414243971948936,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.0414243971948936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.03855289616378947,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.03855289616378947
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02306818884826111,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02306818884826111
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.040735243221471276,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.040735243221471276
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.23870967741935484,
"acc_stderr": 0.02425107126220884,
"acc_norm": 0.23870967741935484,
"acc_norm_stderr": 0.02425107126220884
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.16748768472906403,
"acc_stderr": 0.026273086047535414,
"acc_norm": 0.16748768472906403,
"acc_norm_stderr": 0.026273086047535414
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.28484848484848485,
"acc_stderr": 0.035243908445117836,
"acc_norm": 0.28484848484848485,
"acc_norm_stderr": 0.035243908445117836
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.1919191919191919,
"acc_stderr": 0.028057791672989024,
"acc_norm": 0.1919191919191919,
"acc_norm_stderr": 0.028057791672989024
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.22564102564102564,
"acc_stderr": 0.021193632525148533,
"acc_norm": 0.22564102564102564,
"acc_norm_stderr": 0.021193632525148533
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2074074074074074,
"acc_stderr": 0.024720713193952165,
"acc_norm": 0.2074074074074074,
"acc_norm_stderr": 0.024720713193952165
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24789915966386555,
"acc_stderr": 0.028047967224176896,
"acc_norm": 0.24789915966386555,
"acc_norm_stderr": 0.028047967224176896
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.2018348623853211,
"acc_stderr": 0.01720857935778755,
"acc_norm": 0.2018348623853211,
"acc_norm_stderr": 0.01720857935778755
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.030546745264953202,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.030546745264953202
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.03283472056108567,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.03283472056108567
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.29957805907172996,
"acc_stderr": 0.029818024749753095,
"acc_norm": 0.29957805907172996,
"acc_norm_stderr": 0.029818024749753095
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3004484304932735,
"acc_stderr": 0.030769352008229143,
"acc_norm": 0.3004484304932735,
"acc_norm_stderr": 0.030769352008229143
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3511450381679389,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.3511450381679389,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.19631901840490798,
"acc_stderr": 0.031207970394709215,
"acc_norm": 0.19631901840490798,
"acc_norm_stderr": 0.031207970394709215
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578728,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578728
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.31196581196581197,
"acc_stderr": 0.03035152732334496,
"acc_norm": 0.31196581196581197,
"acc_norm_stderr": 0.03035152732334496
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2796934865900383,
"acc_stderr": 0.016050792148036522,
"acc_norm": 0.2796934865900383,
"acc_norm_stderr": 0.016050792148036522
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3265895953757225,
"acc_stderr": 0.025248264774242832,
"acc_norm": 0.3265895953757225,
"acc_norm_stderr": 0.025248264774242832
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23687150837988827,
"acc_stderr": 0.01421957078810399,
"acc_norm": 0.23687150837988827,
"acc_norm_stderr": 0.01421957078810399
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.27124183006535946,
"acc_stderr": 0.02545775669666787,
"acc_norm": 0.27124183006535946,
"acc_norm_stderr": 0.02545775669666787
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3183279742765273,
"acc_stderr": 0.02645722506781103,
"acc_norm": 0.3183279742765273,
"acc_norm_stderr": 0.02645722506781103
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25308641975308643,
"acc_stderr": 0.024191808600713002,
"acc_norm": 0.25308641975308643,
"acc_norm_stderr": 0.024191808600713002
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.30141843971631205,
"acc_stderr": 0.02737412888263115,
"acc_norm": 0.30141843971631205,
"acc_norm_stderr": 0.02737412888263115
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2790091264667536,
"acc_stderr": 0.011455208832803545,
"acc_norm": 0.2790091264667536,
"acc_norm_stderr": 0.011455208832803545
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.1875,
"acc_stderr": 0.023709788253811766,
"acc_norm": 0.1875,
"acc_norm_stderr": 0.023709788253811766
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.30718954248366015,
"acc_stderr": 0.018663359671463663,
"acc_norm": 0.30718954248366015,
"acc_norm_stderr": 0.018663359671463663
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2818181818181818,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.2818181818181818,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.1836734693877551,
"acc_stderr": 0.02478907133200763,
"acc_norm": 0.1836734693877551,
"acc_norm_stderr": 0.02478907133200763
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2885572139303483,
"acc_stderr": 0.03203841040213322,
"acc_norm": 0.2885572139303483,
"acc_norm_stderr": 0.03203841040213322
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.27710843373493976,
"acc_stderr": 0.034843315926805875,
"acc_norm": 0.27710843373493976,
"acc_norm_stderr": 0.034843315926805875
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.036996580176568775,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.036996580176568775
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2141982864137087,
"mc1_stderr": 0.014362148155690462,
"mc2": 0.3355637385526089,
"mc2_stderr": 0.013068282225164367
},
"harness|winogrande|5": {
"acc": 0.6787687450670876,
"acc_stderr": 0.013123599324558307
},
"harness|gsm8k|5": {
"acc": 0.028051554207733132,
"acc_stderr": 0.004548229533836332
}
}
```
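The per-task scores above can also be aggregated by hand. As a sketch, here is how one might average `acc_norm` over a few of the MMLU entries (values copied verbatim from the JSON above; note this partial mean will differ from the overall "all" figure, which is computed across every task):

```python
# Average acc_norm over a small sample of the per-task results above.
# This is only an illustration, not how the leaderboard aggregates.
sample = {
    "hendrycksTest-abstract_algebra": 0.33,
    "hendrycksTest-anatomy": 0.2222222222222222,
    "hendrycksTest-astronomy": 0.2631578947368421,
}
mean_acc_norm = sum(sample.values()) / len(sample)
print(round(mean_acc_norm, 4))  # 0.2718
```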
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_107 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 888248480.0
num_examples: 174440
download_size: 906076290
dataset_size: 888248480.0
---
# Dataset Card for "chunk_107"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
approximatelabs/tablib-v1-sample | ---
license: other
pretty_name: TabLib
size_categories:
- 1M<n<10M
extra_gated_prompt: >-
Access to this dataset is automatically granted once this form is completed.
Note that this access request is for the TabLib sample, not [the full TabLib dataset](https://huggingface.co/datasets/approximatelabs/tablib-v1-full).
extra_gated_fields:
I agree to abide by the license requirements of the data contained in TabLib: checkbox
---
[](https://discord.gg/kW9nBQErGe)
<img src="https://approximatelabs.com/tablib.png" width="800" />
# TabLib Sample
**NOTE**: This is a 0.1% sample of [the full TabLib dataset](https://huggingface.co/datasets/approximatelabs/tablib-v1-full).
TabLib is a minimally-preprocessed dataset of 627M tables (69 TiB) extracted from HTML, PDF, CSV, TSV, Excel, and SQLite files from GitHub and Common Crawl.
This includes 867B tokens of "context metadata": each table includes provenance information and table context such as filename, text before/after, HTML metadata, etc.
For more information, read the [paper](https://arxiv.org/abs/2310.07875) & [announcement blog](https://approximatelabs.com/blog/tablib).
# Dataset Details
## Sources
* **GitHub**: nearly all public GitHub repositories
* **Common Crawl**: the `CC-MAIN-2023-23` crawl
## Reading Tables
Tables are stored as serialized Arrow bytes in the `arrow_bytes` column. To read these, you will need to deserialize the bytes:
```python
import datasets
import pyarrow as pa
# load a single file of the dataset
ds = datasets.load_dataset(
'approximatelabs/tablib-v1-sample',
token='...',
)
df = ds['train'].to_pandas()
tables = [pa.RecordBatchStreamReader(b).read_all() for b in df['arrow_bytes']]
```
## Licensing
This dataset is intended for research use only.
For specific licensing information, refer to the license of the specific datum being used.
# Contact
If you have any questions, comments, or concerns about licensing, PII, etc., please contact us using [this form](https://forms.gle/C74VTWP7L78QDVR67).
# Approximate Labs
TabLib is a project from Approximate Labs. Find us on [Twitter](https://twitter.com/approximatelabs), [Github](https://github.com/approximatelabs), [Linkedin](https://www.linkedin.com/company/approximate-labs), and [Discord](https://discord.gg/kW9nBQErGe).
# Citations
If you use TabLib for any of your research, please cite the TabLib paper:
```
@misc{eggert2023tablib,
title={TabLib: A Dataset of 627M Tables with Context},
author={Gus Eggert and Kevin Huo and Mike Biven and Justin Waugh},
year={2023},
eprint={2310.07875},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
NobodyExistsOnTheInternet/sysmsgalpacatest | ---
license: mit
---
|
BangumiBase/inuninattarasukinahitonihirowareta | ---
license: mit
tags:
- art
size_categories:
- n<1K
---
# Bangumi Image Base of Inu Ni Nattara Suki Na Hito Ni Hirowareta
This is the image base of the bangumi Inu ni Nattara Suki na Hito ni Hirowareta. We detected 9 characters and 406 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may actually contain noisy samples.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 67 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 92 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 14 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 11 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 23 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 32 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 74 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 44 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 49 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
CyberHarem/zara_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of zara/ザラ (Kantai Collection)
This is the dataset of zara/ザラ (Kantai Collection), containing 406 images and their tags.
The core tags of this character are `long_hair, blonde_hair, braid, wavy_hair, breasts, french_braid, hat, large_breasts, mini_hat, brown_eyes, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 406 | 421.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zara_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 406 | 270.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zara_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 906 | 560.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zara_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 406 | 385.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zara_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 906 | 747.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zara_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/zara_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, long_sleeves, looking_at_viewer, red_ascot, solo, upper_body, bare_shoulders, corset, white_shirt, blush, side_braid, simple_background, smile, white_background, clothing_cutout, open_mouth, one-hour_drawing_challenge, one_eye_closed, twitter_username, yellow_eyes |
| 1 | 5 |  |  |  |  |  | 1girl, corset, one-hour_drawing_challenge, red_ascot, red_skirt, simple_background, solo, twitter_username, white_background, white_shirt, clothing_cutout, side_braid, bangs, bare_shoulders, blush, cowboy_shot, long_sleeves, looking_at_viewer, open_mouth, dated, purple_eyes |
| 2 | 6 |  |  |  |  |  | 1girl, ascot, long_sleeves, looking_at_viewer, red_skirt, solo, white_shirt, corset, miniskirt, smile, bare_shoulders |
| 3 | 6 |  |  |  |  |  | 1girl, ascot, bare_shoulders, simple_background, solo, white_background, white_shirt, blush, long_sleeves, upper_body, looking_at_viewer, purple_eyes |
| 4 | 10 |  |  |  |  |  | 1girl, bare_shoulders, solo, miniskirt, looking_at_viewer, open_mouth, purple_eyes, smile, black_pantyhose, twitter_username |
| 5 | 6 |  |  |  |  |  | 1girl, blush, cleavage, collarbone, looking_at_viewer, navel, solo, bangs, side_braid, simple_background, cowboy_shot, white_background, red_bikini |
| 6 | 9 |  |  |  |  |  | 1girl, fake_animal_ears, looking_at_viewer, rabbit_ears, solo, wrist_cuffs, alternate_costume, blush, cleavage, playboy_bunny, simple_background, white_background, detached_collar, rabbit_tail, strapless, black_pantyhose, cowboy_shot, red_bowtie, black_leotard, twitter_username |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | long_sleeves | looking_at_viewer | red_ascot | solo | upper_body | bare_shoulders | corset | white_shirt | blush | side_braid | simple_background | smile | white_background | clothing_cutout | open_mouth | one-hour_drawing_challenge | one_eye_closed | twitter_username | yellow_eyes | red_skirt | bangs | cowboy_shot | dated | purple_eyes | ascot | miniskirt | black_pantyhose | cleavage | collarbone | navel | red_bikini | fake_animal_ears | rabbit_ears | wrist_cuffs | alternate_costume | playboy_bunny | detached_collar | rabbit_tail | strapless | red_bowtie | black_leotard |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:--------------------|:------------|:-------|:-------------|:-----------------|:---------|:--------------|:--------|:-------------|:--------------------|:--------|:-------------------|:------------------|:-------------|:-----------------------------|:-----------------|:-------------------|:--------------|:------------|:--------|:--------------|:--------|:--------------|:--------|:------------|:------------------|:-----------|:-------------|:--------|:-------------|:-------------------|:--------------|:--------------|:--------------------|:----------------|:------------------|:--------------|:------------|:-------------|:----------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | | X | X | X | X | X | X | | X | X | X | X | | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | | X | | X | X | X | | | | X | | | | | | | | X | | | | | X | X | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | X | X | | X | X | X | | X | X | | X | | X | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | |
| 4 | 10 |  |  |  |  |  | X | | X | | X | | X | | | | | | X | | | X | | | X | | | | | | X | | X | X | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | | X | | X | | | | | X | X | X | | X | | | | | | | | X | X | | | | | | X | X | X | X | | | | | | | | | | |
| 6 | 9 |  |  |  |  |  | X | | X | | X | | | | | X | | X | | X | | | | | X | | | | X | | | | | X | X | | | | X | X | X | X | X | X | X | X | X | X |
|
MustafaSuleyman/real-toxicity-prompts | ---
license: cc0-1.0
tags:
- ChatGPT
---
<p align="center"><h1>🧠 Awesome ChatGPT Prompts [CSV dataset]</h1></p>
This is a Dataset Repository of **Awesome ChatGPT Prompts**
**[View All Prompts on GitHub](https://github.com/f/awesome-chatgpt-prompts)**
# License
CC-0
|
AdapterOcean/med_alpaca_standardized_cluster_91 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 121262616
num_examples: 11900
download_size: 36404163
dataset_size: 121262616
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_91"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
itecgo/Topical-Chat-chatml | ---
dataset_info:
features:
- name: id
dtype: string
- name: content
list:
- name: role
dtype: string
- name: content
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 43854444
num_examples: 8628
download_size: 24801759
dataset_size: 43854444
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ericflo/unnaturalhermes-reflections-100k | ---
license: apache-2.0
---
|
arianhosseini/openai_summarize_comparisons_relabel_pythia1b_iter1_temp0.7 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 35977664
num_examples: 20000
download_size: 21784615
dataset_size: 35977664
---
# Dataset Card for "openai_summarize_comparisons_relabel_pythia1b_iter1_temp0.7"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
goodfellowliu/BSDS200 | ---
license: apache-2.0
---
|
keremberke/garbage-object-detection | ---
task_categories:
- object-detection
tags:
- roboflow
---
### Roboflow Dataset Page
[https://universe.roboflow.com/material-identification/garbage-classification-3/dataset/2](https://universe.roboflow.com/material-identification/garbage-classification-3/dataset/2?ref=roboflow2huggingface)
### Dataset Labels
```
['biodegradable', 'cardboard', 'glass', 'metal', 'paper', 'plastic']
```
### Citation
```
@misc{ garbage-classification-3_dataset,
title = { GARBAGE CLASSIFICATION 3 Dataset },
type = { Open Source Dataset },
author = { Material Identification },
howpublished = { \\url{ https://universe.roboflow.com/material-identification/garbage-classification-3 } },
url = { https://universe.roboflow.com/material-identification/garbage-classification-3 },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2022 },
month = { mar },
note = { visited on 2023-01-02 },
}
```
### License
CC BY 4.0
### Dataset Summary
This dataset was exported via roboflow.com on July 27, 2022 at 5:44 AM GMT
Roboflow is an end-to-end computer vision platform that helps you
* collaborate with your team on computer vision projects
* collect & organize images
* understand unstructured image data
* annotate, and create datasets
* export, train, and deploy computer vision models
* use active learning to improve your dataset over time
It includes 10464 images.
GARBAGE-GARBAGE-CLASSIFICATION are annotated in COCO format.
The following pre-processing was applied to each image:
* Auto-orientation of pixel data (with EXIF-orientation stripping)
* Resize to 416x416 (Stretch)
The following augmentation was applied to create 1 versions of each source image:
* 50% probability of horizontal flip
* 50% probability of vertical flip
* Equal probability of one of the following 90-degree rotations: none, clockwise, counter-clockwise, upside-down
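As an illustrative sketch (not Roboflow's actual implementation), the augmentation recipe above can be expressed on a raw pixel grid:

```python
import random

def hflip(pixels):
    """Mirror each row (horizontal flip)."""
    return [row[::-1] for row in pixels]

def vflip(pixels):
    """Reverse the row order (vertical flip)."""
    return pixels[::-1]

def rot90(pixels):
    """Rotate the grid 90 degrees clockwise."""
    return [list(row) for row in zip(*pixels[::-1])]

def augment(pixels, rng=random):
    """Apply the recipe: 50% horizontal flip, 50% vertical flip, then one of
    the four 90-degree orientations (none/cw/upside-down/ccw) with equal
    probability."""
    if rng.random() < 0.5:
        pixels = hflip(pixels)
    if rng.random() < 0.5:
        pixels = vflip(pixels)
    for _ in range(rng.randrange(4)):  # 0 = none, 1..3 = quarter turns
        pixels = rot90(pixels)
    return pixels
```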
|
Gummybear05/E50_Yspeed | ---
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sample_rate
dtype: int64
- name: text
dtype: string
- name: scriptId
dtype: int64
- name: fileNm
dtype: string
- name: recrdTime
dtype: float64
- name: recrdQuality
dtype: int64
- name: recrdDt
dtype: string
- name: scriptSetNo
dtype: string
- name: recrdEnvrn
dtype: string
- name: colctUnitCode
dtype: string
- name: cityCode
dtype: string
- name: recrdUnit
dtype: string
- name: convrsThema
dtype: string
- name: gender
dtype: string
- name: recorderId
dtype: string
- name: age
dtype: int64
splits:
- name: train
num_bytes: 11195854719
num_examples: 12401
download_size: 5511265055
dataset_size: 11195854719
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
eckendoerffer/wikipedia_fr | ---
license: cc-by-sa-3.0
task_categories:
- text-generation
language:
- fr
tags:
- wikipedia
- wiki
- fr.wikipedia.org
size_categories:
- 1M<n<10M
---
# French Wikipedia Dataset
## Overview
This dataset is a curated collection of approximately 1.1 million French Wikipedia articles, scraped directly from the [official French Wikipedia site](https://fr.wikipedia.org/) on September 24, 2023.
There are already numerous datasets for Wikipedia, including the official one with [Wikipedia's dump](https://huggingface.co/datasets/wikipedia). Unfortunately, the text for the French version of this dataset is incomplete, lacking many elements like dates and locations.
As the saying goes, "garbage in, garbage out."
## Format
- **Type**: Text
- **File Extension**: `.txt`
## Structure
The dataset is divided into the following splits:
- `train.txt`: 3.45 GB - 1,810,000 rows - 90%
- `test.txt` : 192 MB - 100,575 rows - 5%
- `valid.txt`: 192 MB - 100,575 rows - 5%
Each article in the dataset exceeds 1400 characters in length.
## Data Cleaning and Preprocessing
The following elements have been excluded from the dataset:
- H1 - H4 Headings
- Lists
- Tables
- Sources and References
- Info box
- Banners
- LaTeX code
The text has been standardized for consistent formatting and line length. Additionally, the dataset has been filtered using the `langid` library to include only text in French. Some quotations or short terms in other languages, including non-Latin languages, may still be present.
## Exploring the Dataset
You can use the `explore_dataset.py` script to explore the dataset by randomly displaying a certain number of lines from it. The script creates and saves an index based on the line breaks, enabling faster data retrieval and display.
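The actual `explore_dataset.py` is not reproduced in this card; a minimal stdlib sketch of the same idea (index the byte offset of each line break, then seek to fetch random lines) might look like:

```python
import os
import random
import tempfile

def build_index(path):
    """Record the byte offset of each line so lines can be fetched at random."""
    offsets = []
    with open(path, "rb") as f:
        pos = 0
        for line in f:
            offsets.append(pos)
            pos += len(line)
    return offsets

def read_line(path, offsets, i):
    """Seek directly to line i using the precomputed index."""
    with open(path, "rb") as f:
        f.seek(offsets[i])
        return f.readline().decode("utf-8").rstrip("\n")

# Demo on a throwaway file.
with tempfile.NamedTemporaryFile("w", delete=False, suffix=".txt", encoding="utf-8") as tmp:
    tmp.write("premiere ligne\ndeuxieme ligne\ntroisieme ligne\n")
path = tmp.name
idx = build_index(path)
print(read_line(path, idx, random.randrange(len(idx))))
os.unlink(path)
```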
## Additional Information
This dataset is a subset of a larger 10GB French dataset, which also contains several thousand books and theses in French, as well as several hundred thousand Francophone news articles.
---
# WIKIPEDIA EXTRACT
Inside the `/extract_wiki/` directory, you'll find Python scripts used to extract text to compile this dataset.
## Requirements:
```python
pip install datasets aiohttp aiofiles beautifulsoup4 langid
```
## Scripts:
1. **1_extract_link.py**
```python
python 1_extract_link.py
```
Script to download the Wikipedia dataset from Hugging Face, extract URLs, and save them to a text file for further processing.
2. **2_extract_content.py**
```python
python 2_extract_content.py
```
This script retrieves the source code of Wikipedia pages based on URLs found in a text file. Instead of saving the entire HTML of the page, it trims the content, focusing on the main article section, thereby limiting the size of each record.
3. **3_extract_txt.py**
```python
python 3_extract_txt.py
```
This script extracts the text from the HTML pages and conducts tests to filter the content that should be retained or excluded. This includes language checks, special characters, numbers, etc.
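The exact filtering criteria live in `3_extract_txt.py` and are not reproduced in this card; as a hedged illustration of the kind of checks described (length, digits, special characters), a quality gate might look like the following. The thresholds below are guesses, not the script's real values; the card only states that kept articles exceed 1400 characters.

```python
def keep_article(text, min_len=1400, max_digit_ratio=0.15, max_symbol_ratio=0.10):
    """Illustrative quality gate: reject short articles and articles with an
    unusually high share of digits or special characters."""
    if len(text) < min_len:
        return False
    digits = sum(c.isdigit() for c in text)
    symbols = sum(not (c.isalnum() or c.isspace()) for c in text)
    if digits / len(text) > max_digit_ratio:
        return False
    if symbols / len(text) > max_symbol_ratio:
        return False
    return True
```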
|
deetsadi/processed_dwi_cropped | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: conditioning_image
dtype: image
splits:
- name: train
num_bytes: 12023715.0
num_examples: 200
download_size: 11594705
dataset_size: 12023715.0
---
# Dataset Card for "processed_dwi_cropped"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Zaid/xquad_tr | ---
dataset_info:
features:
- name: id
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
splits:
- name: train
num_bytes: 979782.9050420168
num_examples: 963
- name: validation
num_bytes: 121073.9
num_examples: 119
- name: test
num_bytes: 109882.1949579832
num_examples: 108
download_size: 353715
dataset_size: 1210739.0
---
# Dataset Card for "xquad_tr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Gille__StrangeMerges_22-7B-slerp | ---
pretty_name: Evaluation run of Gille/StrangeMerges_22-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Gille/StrangeMerges_22-7B-slerp](https://huggingface.co/Gille/StrangeMerges_22-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Gille__StrangeMerges_22-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-13T01:26:13.113566](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_22-7B-slerp/blob/main/results_2024-02-13T01-26-13.113566.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6543671954489716,\n\
\ \"acc_stderr\": 0.03205804055740569,\n \"acc_norm\": 0.6535998321857869,\n\
\ \"acc_norm_stderr\": 0.03273084993490379,\n \"mc1\": 0.6009791921664627,\n\
\ \"mc1_stderr\": 0.017142825728496763,\n \"mc2\": 0.7490450940434222,\n\
\ \"mc2_stderr\": 0.014305107509742374\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7167235494880546,\n \"acc_stderr\": 0.013167478735134575,\n\
\ \"acc_norm\": 0.7372013651877133,\n \"acc_norm_stderr\": 0.012862523175351335\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.719577773351922,\n\
\ \"acc_stderr\": 0.004482874732237349,\n \"acc_norm\": 0.8902609042023502,\n\
\ \"acc_norm_stderr\": 0.003119254828848947\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.040943762699967926,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.040943762699967926\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249386,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249386\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997692,\n \"\
acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997692\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n\
\ \"acc_stderr\": 0.023025899617188712,\n \"acc_norm\": 0.7935483870967742,\n\
\ \"acc_norm_stderr\": 0.023025899617188712\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218967,\n \"\
acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218967\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \
\ \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394848,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394848\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.03983798306659807,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.03983798306659807\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461763,\n \"\
acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461763\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553346,\n\
\ \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553346\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044287,\n\
\ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044287\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4491620111731844,\n\
\ \"acc_stderr\": 0.01663583834163192,\n \"acc_norm\": 0.4491620111731844,\n\
\ \"acc_norm_stderr\": 0.01663583834163192\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.025261691219729484,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.025261691219729484\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712992,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712992\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47327249022164275,\n\
\ \"acc_stderr\": 0.012751977967676013,\n \"acc_norm\": 0.47327249022164275,\n\
\ \"acc_norm_stderr\": 0.012751977967676013\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462923,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462923\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507208,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507208\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6009791921664627,\n\
\ \"mc1_stderr\": 0.017142825728496763,\n \"mc2\": 0.7490450940434222,\n\
\ \"mc2_stderr\": 0.014305107509742374\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8476716653512234,\n \"acc_stderr\": 0.010099208246065604\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6974981046247157,\n \
\ \"acc_stderr\": 0.012652544133186141\n }\n}\n```"
repo_url: https://huggingface.co/Gille/StrangeMerges_22-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|arc:challenge|25_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|gsm8k|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hellaswag|10_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T01-26-13.113566.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-13T01-26-13.113566.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- '**/details_harness|winogrande|5_2024-02-13T01-26-13.113566.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-13T01-26-13.113566.parquet'
- config_name: results
data_files:
- split: 2024_02_13T01_26_13.113566
path:
- results_2024-02-13T01-26-13.113566.parquet
- split: latest
path:
- results_2024-02-13T01-26-13.113566.parquet
---
# Dataset Card for Evaluation run of Gille/StrangeMerges_22-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_22-7B-slerp](https://huggingface.co/Gille/StrangeMerges_22-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Gille__StrangeMerges_22-7B-slerp",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-13T01:26:13.113566](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_22-7B-slerp/blob/main/results_2024-02-13T01-26-13.113566.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6543671954489716,
"acc_stderr": 0.03205804055740569,
"acc_norm": 0.6535998321857869,
"acc_norm_stderr": 0.03273084993490379,
"mc1": 0.6009791921664627,
"mc1_stderr": 0.017142825728496763,
"mc2": 0.7490450940434222,
"mc2_stderr": 0.014305107509742374
},
"harness|arc:challenge|25": {
"acc": 0.7167235494880546,
"acc_stderr": 0.013167478735134575,
"acc_norm": 0.7372013651877133,
"acc_norm_stderr": 0.012862523175351335
},
"harness|hellaswag|10": {
"acc": 0.719577773351922,
"acc_stderr": 0.004482874732237349,
"acc_norm": 0.8902609042023502,
"acc_norm_stderr": 0.003119254828848947
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.040943762699967926,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.040943762699967926
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249386,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249386
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.025253032554997692,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.025253032554997692
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.023025899617188712,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.023025899617188712
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218967,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218967
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.02371088850197057,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.02371088850197057
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02874204090394848,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02874204090394848
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886793,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.03983798306659807,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.03983798306659807
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461763,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461763
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553346,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553346
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044287,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4491620111731844,
"acc_stderr": 0.01663583834163192,
"acc_norm": 0.4491620111731844,
"acc_norm_stderr": 0.01663583834163192
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.025261691219729484,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.025261691219729484
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712992,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712992
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47327249022164275,
"acc_stderr": 0.012751977967676013,
"acc_norm": 0.47327249022164275,
"acc_norm_stderr": 0.012751977967676013
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462923,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462923
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507208,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507208
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685515,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6009791921664627,
"mc1_stderr": 0.017142825728496763,
"mc2": 0.7490450940434222,
"mc2_stderr": 0.014305107509742374
},
"harness|winogrande|5": {
"acc": 0.8476716653512234,
"acc_stderr": 0.010099208246065604
},
"harness|gsm8k|5": {
"acc": 0.6974981046247157,
"acc_stderr": 0.012652544133186141
}
}
```
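The JSON above is flat per-task; aggregate numbers such as the MMLU average can be recomputed client-side. Below is a minimal sketch using a small excerpt of the values shown above (the keys and scores are copied from the results dictionary; the aggregation itself is illustrative and not part of the leaderboard tooling):

```python
# Recompute the mean accuracy over the MMLU ("hendrycksTest") subtasks
# from a small excerpt of the results dictionary shown above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.35},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6592592592592592},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6973684210526315},
    "harness|winogrande|5": {"acc": 0.8476716653512234},  # not an MMLU subtask
}

# Keep only the MMLU subtasks and average their accuracies.
mmlu_scores = [v["acc"] for k, v in results.items()
               if k.startswith("harness|hendrycksTest-")]
mean_acc = sum(mmlu_scores) / len(mmlu_scores)
print(f"{len(mmlu_scores)} MMLU subtasks, mean acc = {mean_acc:.4f}")
```

The same pattern extends to all 57 `hendrycksTest` entries in the full results dictionary.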
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
diplomacy_detection | ---
annotations_creators:
- found
language_creators:
- found
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories:
- n<1K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- intent-classification
pretty_name: Diplomacy Detection
dataset_info:
features:
- name: messages
sequence: string
- name: sender_labels
sequence:
class_label:
names:
'0': 'false'
'1': 'true'
- name: receiver_labels
sequence:
class_label:
names:
'0': 'false'
'1': 'true'
'2': noannotation
- name: speakers
sequence:
class_label:
names:
'0': italy
'1': turkey
'2': russia
'3': england
'4': austria
'5': germany
'6': france
- name: receivers
sequence:
class_label:
names:
'0': italy
'1': turkey
'2': russia
'3': england
'4': austria
'5': germany
'6': france
- name: absolute_message_index
sequence: int64
- name: relative_message_index
sequence: int64
- name: seasons
sequence:
class_label:
names:
'0': spring
'1': fall
'2': winter
'3': Spring
'4': Fall
'5': Winter
- name: years
sequence:
class_label:
names:
'0': '1901'
'1': '1902'
'2': '1903'
'3': '1904'
'4': '1905'
'5': '1906'
'6': '1907'
'7': '1908'
'8': '1909'
'9': '1910'
'10': '1911'
'11': '1912'
'12': '1913'
'13': '1914'
'14': '1915'
'15': '1916'
'16': '1917'
'17': '1918'
- name: game_score
sequence:
class_label:
names:
'0': '0'
'1': '1'
'2': '2'
'3': '3'
'4': '4'
'5': '5'
'6': '6'
'7': '7'
'8': '8'
'9': '9'
'10': '10'
'11': '11'
'12': '12'
'13': '13'
'14': '14'
'15': '15'
'16': '16'
'17': '17'
'18': '18'
- name: game_score_delta
sequence:
class_label:
names:
'0': '0'
'1': '1'
'2': '2'
'3': '3'
'4': '4'
'5': '5'
'6': '6'
'7': '7'
'8': '8'
'9': '9'
'10': '10'
'11': '11'
'12': '12'
'13': '13'
'14': '14'
'15': '15'
'16': '16'
'17': '17'
'18': '18'
'19': '-1'
'20': '-2'
'21': '-3'
'22': '-4'
'23': '-5'
'24': '-6'
'25': '-7'
'26': '-8'
'27': '-9'
'28': '-10'
'29': '-11'
'30': '-12'
'31': '-13'
'32': '-14'
'33': '-15'
'34': '-16'
'35': '-17'
'36': '-18'
- name: players
sequence:
class_label:
names:
'0': italy
'1': turkey
'2': russia
'3': england
'4': austria
'5': germany
'6': france
- name: game_id
dtype: int64
splits:
- name: validation
num_bytes: 254344
num_examples: 21
- name: train
num_bytes: 2539778
num_examples: 189
- name: test
num_bytes: 506191
num_examples: 42
download_size: 3208706
dataset_size: 3300313
---
# Dataset Card for Diplomacy Detection
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage** : https://sites.google.com/view/qanta/projects/diplomacy
- **Repository** : https://github.com/DenisPeskov/2020_acl_diplomacy
- **Paper** : http://users.umiacs.umd.edu/~jbg/docs/2020_acl_diplomacy.pdf
- **Leaderboard** :
- **Point of Contact** :
### Dataset Summary
This dataset contains pairwise conversations annotated by the sender and the receiver for deception (and conversely truthfulness). The 17,289 messages are gathered from 12 games.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
English
## Dataset Structure
### Data Instances
```
{
"messages":
["Greetings Sultan!\n\nAs your neighbor I would like to propose an alliance! What are your views on the board so far?", "I think an alliance would be great! Perhaps a dmz in the Black Sea would be a good idea to solidify this alliance?\n\nAs for my views on the board, my first moves will be Western into the Balkans and Mediterranean Sea.", "Sounds good lets call a dmz in the black sea", "What's our move this year?", "I've been away from the game for a while", "Not sure yet, what are your thoughts?", "Well I'm pretty worried about Germany attacking me (and Austria to a lesser extent) so im headed west. It looks like Italy's landing a army in Syr this fall unless you can stop it", "That sounds good to me. I'll move to defend against Italy while you move west. If it's not too much too ask, I'd like to request that you withdraw your fleet from bla.", "Oh sorry missed the msg to move out of bl sea ill do that this turn. I did bring my army down into Armenia, To help you expel the Italian. It looks like Austria and Italy are working together. If we have a chance in the region you should probably use smy to protect con. We can't afford to lose con.", "I'll defend con from both ank and smy.", "Hey sorry for stabbing you earlier, it was an especially hard choice since Turkey is usually my country of choice. It's cool we got to do this study huh?"],
"sender_labels": [false, true, false, true, true, true, true, true, true, true, true],
"receiver_labels": [true, true, true, true, true, true, true, true, true, true, "NOANNOTATION"],
"speakers": ["russia", "turkey", "russia", "russia", "russia", "turkey", "russia", "turkey", "russia", "turkey", "russia"],
"receivers": ["turkey", "russia", "turkey", "turkey", "turkey", "russia", "turkey", "russia", "turkey", "russia", "turkey"],
"absolute_message_index": [78, 107, 145, 370, 371, 374, 415, 420, 495, 497, 717],
"relative_message_index": [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10],
"seasons": ["Spring", "Spring", "Spring", "Spring", "Spring", "Spring", "Fall", "Fall", "Spring", "Spring", "Fall"],
"years": ["1901", "1901", "1901", "1902", "1902", "1902", "1902", "1902", "1903", "1903", "1905"],
"game_score": ["4", "3", "4", "5", "5", "4", "5", "4", "5", "3", "7"],
"game_score_delta": ["1", "-1", "1", "1", "1", "-1", "1", "-1", "2", "-2", "7"],
"players": ["russia", "turkey"],
"game_id": 10
}
```
### Data Fields
- speakers: the sender of the message (string format; seven possible values: russia, turkey, england, austria, germany, france, italy)
- receivers: the receiver of the message (string format; same seven possible values)
- messages: the raw message text (string format; ranges from a single word to multiple paragraphs)
- sender_labels: indicates whether the sender marked the message as truthful (true) or deceptive (false). This is used for the ACTUAL_LIE calculation (true/false, which can be bool or string format)
- receiver_labels: indicates whether the receiver perceived the message as truthful (true) or deceptive (false). In fewer than 10% of cases, no annotation was received. This is used for the SUSPECTED_LIE calculation (string format: true/false/"NOANNOTATION")
- game_score: the sender's current game score, i.e. supply centers (string format; ranges from 0 to 18)
- game_score_delta: the sender's current game score minus the recipient's (string format; ranges from -18 to 18)
- absolute_message_index: the index of the message within the entire game, across all dialogs (int format)
- relative_message_index: the index of the message within the current dialog (int format)
- seasons: the Diplomacy season associated with the year (string format: Spring, Fall, Winter)
- years: the Diplomacy year associated with the season (string format: 1901 through 1918)
- game_id: which of the 12 games the dialog comes from (int format; ranges from 1 to 12)
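The sender/receiver label semantics above (ACTUAL_LIE vs. SUSPECTED_LIE) can be made concrete with a short sketch. It operates on the raw JSON shape shown under Data Instances (booleans plus the "NOANNOTATION" sentinel); note that the Hugging Face loader instead encodes these fields as integer class labels, and the helper name below is illustrative, not part of the dataset loader:

```python
# Count actual lies (sender-annotated) and suspected lies
# (receiver-annotated) in one dialog record.
def lie_counts(record):
    actual = sum(1 for label in record["sender_labels"] if label is False)
    suspected = sum(1 for label in record["receiver_labels"] if label is False)
    annotated = sum(1 for label in record["receiver_labels"]
                    if label != "NOANNOTATION")
    return {"actual_lies": actual,
            "suspected_lies": suspected,
            "receiver_annotated": annotated}

# Truncated version of the example instance shown above.
record = {
    "sender_labels": [False, True, False, True, True],
    "receiver_labels": [True, True, True, True, "NOANNOTATION"],
}
print(lie_counts(record))
# → {'actual_lies': 2, 'suspected_lies': 0, 'receiver_annotated': 4}
```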
### Data Splits
The dataset is split into train (189 dialogs), validation (21 dialogs), and test (42 dialogs).
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Unknown
### Citation Information
```
@inproceedings{Peskov:Cheng:Elgohary:Barrow:Danescu-Niculescu-Mizil:Boyd-Graber-2020,
Title = {It Takes Two to Lie: One to Lie and One to Listen},
Author = {Denis Peskov and Benny Cheng and Ahmed Elgohary and Joe Barrow and Cristian Danescu-Niculescu-Mizil and Jordan Boyd-Graber},
Booktitle = {Association for Computational Linguistics},
Year = {2020},
Location = {Seattle},
}
```
### Contributions
Thanks to [@MisbahKhan789](https://github.com/MisbahKhan789) for adding this dataset. |
open-llm-leaderboard/details_MaziyarPanahi__WizardLM-Math-70B-TIES-v0.1 | ---
pretty_name: Evaluation run of MaziyarPanahi/WizardLM-Math-70B-TIES-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MaziyarPanahi/WizardLM-Math-70B-TIES-v0.1](https://huggingface.co/MaziyarPanahi/WizardLM-Math-70B-TIES-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MaziyarPanahi__WizardLM-Math-70B-TIES-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-18T06:49:50.553009](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__WizardLM-Math-70B-TIES-v0.1/blob/main/results_2024-02-18T06-49-50.553009.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6868282613819305,\n\
\ \"acc_stderr\": 0.030371866427473967,\n \"acc_norm\": 0.695311288530275,\n\
\ \"acc_norm_stderr\": 0.030984285786669577,\n \"mc1\": 0.36964504283965727,\n\
\ \"mc1_stderr\": 0.01689818070697388,\n \"mc2\": 0.5360987678643523,\n\
\ \"mc2_stderr\": 0.014938153988985473\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6424914675767918,\n \"acc_stderr\": 0.014005494275916573,\n\
\ \"acc_norm\": 0.6851535836177475,\n \"acc_norm_stderr\": 0.01357265770308495\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6836287592113125,\n\
\ \"acc_stderr\": 0.004641092001425294,\n \"acc_norm\": 0.8686516630153356,\n\
\ \"acc_norm_stderr\": 0.0033709059327855567\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8026315789473685,\n \"acc_stderr\": 0.03238981601699397,\n\
\ \"acc_norm\": 0.8026315789473685,\n \"acc_norm_stderr\": 0.03238981601699397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.72,\n\
\ \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7471698113207547,\n \"acc_stderr\": 0.026749899771241214,\n\
\ \"acc_norm\": 0.7471698113207547,\n \"acc_norm_stderr\": 0.026749899771241214\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n\
\ \"acc_stderr\": 0.032166008088022675,\n \"acc_norm\": 0.8194444444444444,\n\
\ \"acc_norm_stderr\": 0.032166008088022675\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.047551296160629475,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.047551296160629475\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6893617021276596,\n \"acc_stderr\": 0.03025123757921317,\n\
\ \"acc_norm\": 0.6893617021276596,\n \"acc_norm_stderr\": 0.03025123757921317\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n\
\ \"acc_stderr\": 0.046151869625837026,\n \"acc_norm\": 0.40350877192982454,\n\
\ \"acc_norm_stderr\": 0.046151869625837026\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.040703290137070705,\n\
\ \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.040703290137070705\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4312169312169312,\n \"acc_stderr\": 0.0255064816981382,\n \"acc_norm\"\
: 0.4312169312169312,\n \"acc_norm_stderr\": 0.0255064816981382\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \"\
acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8193548387096774,\n \"acc_stderr\": 0.021886178567172527,\n \"\
acc_norm\": 0.8193548387096774,\n \"acc_norm_stderr\": 0.021886178567172527\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5320197044334976,\n \"acc_stderr\": 0.03510766597959217,\n \"\
acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.03510766597959217\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8121212121212121,\n \"acc_stderr\": 0.03050193405942914,\n\
\ \"acc_norm\": 0.8121212121212121,\n \"acc_norm_stderr\": 0.03050193405942914\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8888888888888888,\n \"acc_stderr\": 0.022390787638216773,\n \"\
acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.022390787638216773\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.927461139896373,\n \"acc_stderr\": 0.018718998520678185,\n\
\ \"acc_norm\": 0.927461139896373,\n \"acc_norm_stderr\": 0.018718998520678185\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7230769230769231,\n \"acc_stderr\": 0.022688042352424994,\n\
\ \"acc_norm\": 0.7230769230769231,\n \"acc_norm_stderr\": 0.022688042352424994\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948492,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8109243697478992,\n \"acc_stderr\": 0.02543511943810537,\n \
\ \"acc_norm\": 0.8109243697478992,\n \"acc_norm_stderr\": 0.02543511943810537\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4304635761589404,\n \"acc_stderr\": 0.04042809961395634,\n \"\
acc_norm\": 0.4304635761589404,\n \"acc_norm_stderr\": 0.04042809961395634\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8862385321100917,\n \"acc_stderr\": 0.0136136148002328,\n \"acc_norm\"\
: 0.8862385321100917,\n \"acc_norm_stderr\": 0.0136136148002328\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5879629629629629,\n\
\ \"acc_stderr\": 0.03356787758160831,\n \"acc_norm\": 0.5879629629629629,\n\
\ \"acc_norm_stderr\": 0.03356787758160831\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.9166666666666666,\n \"acc_stderr\": 0.019398452135813895,\n\
\ \"acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.019398452135813895\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8776371308016878,\n \"acc_stderr\": 0.02133174182974679,\n \
\ \"acc_norm\": 0.8776371308016878,\n \"acc_norm_stderr\": 0.02133174182974679\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8161434977578476,\n\
\ \"acc_stderr\": 0.025998379092356513,\n \"acc_norm\": 0.8161434977578476,\n\
\ \"acc_norm_stderr\": 0.025998379092356513\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8473282442748091,\n \"acc_stderr\": 0.03154521672005472,\n\
\ \"acc_norm\": 0.8473282442748091,\n \"acc_norm_stderr\": 0.03154521672005472\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8512396694214877,\n \"acc_stderr\": 0.03248470083807194,\n \"\
acc_norm\": 0.8512396694214877,\n \"acc_norm_stderr\": 0.03248470083807194\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.803680981595092,\n \"acc_stderr\": 0.031207970394709225,\n\
\ \"acc_norm\": 0.803680981595092,\n \"acc_norm_stderr\": 0.031207970394709225\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573975,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573975\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8620689655172413,\n\
\ \"acc_stderr\": 0.012331009307795663,\n \"acc_norm\": 0.8620689655172413,\n\
\ \"acc_norm_stderr\": 0.012331009307795663\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7745664739884393,\n \"acc_stderr\": 0.022497230190967558,\n\
\ \"acc_norm\": 0.7745664739884393,\n \"acc_norm_stderr\": 0.022497230190967558\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5452513966480447,\n\
\ \"acc_stderr\": 0.016653875777523995,\n \"acc_norm\": 0.5452513966480447,\n\
\ \"acc_norm_stderr\": 0.016653875777523995\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7845659163987139,\n\
\ \"acc_stderr\": 0.023350225475471442,\n \"acc_norm\": 0.7845659163987139,\n\
\ \"acc_norm_stderr\": 0.023350225475471442\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8364197530864198,\n \"acc_stderr\": 0.02058146613825712,\n\
\ \"acc_norm\": 0.8364197530864198,\n \"acc_norm_stderr\": 0.02058146613825712\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5177304964539007,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.5177304964539007,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5658409387222947,\n\
\ \"acc_stderr\": 0.012659033237067253,\n \"acc_norm\": 0.5658409387222947,\n\
\ \"acc_norm_stderr\": 0.012659033237067253\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7683823529411765,\n \"acc_stderr\": 0.025626533803777562,\n\
\ \"acc_norm\": 0.7683823529411765,\n \"acc_norm_stderr\": 0.025626533803777562\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7532679738562091,\n \"acc_stderr\": 0.0174408203674025,\n \
\ \"acc_norm\": 0.7532679738562091,\n \"acc_norm_stderr\": 0.0174408203674025\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n\
\ \"acc_stderr\": 0.04265792110940588,\n \"acc_norm\": 0.7272727272727273,\n\
\ \"acc_norm_stderr\": 0.04265792110940588\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.02560737598657916,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.02560737598657916\n },\n\
\ \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n\
\ \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n\
\ \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.93,\n \"acc_stderr\": 0.0256432399976243,\n \
\ \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.0256432399976243\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n\
\ \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36964504283965727,\n\
\ \"mc1_stderr\": 0.01689818070697388,\n \"mc2\": 0.5360987678643523,\n\
\ \"mc2_stderr\": 0.014938153988985473\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8271507498026835,\n \"acc_stderr\": 0.010626964529971855\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.27369219105382864,\n \
\ \"acc_stderr\": 0.012281003490963456\n }\n}\n```"
repo_url: https://huggingface.co/MaziyarPanahi/WizardLM-Math-70B-TIES-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|arc:challenge|25_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|gsm8k|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hellaswag|10_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T06-49-50.553009.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-18T06-49-50.553009.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- '**/details_harness|winogrande|5_2024-02-18T06-49-50.553009.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-18T06-49-50.553009.parquet'
- config_name: results
data_files:
- split: 2024_02_18T06_49_50.553009
path:
- results_2024-02-18T06-49-50.553009.parquet
- split: latest
path:
- results_2024-02-18T06-49-50.553009.parquet
---
# Dataset Card for Evaluation run of MaziyarPanahi/WizardLM-Math-70B-TIES-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [MaziyarPanahi/WizardLM-Math-70B-TIES-v0.1](https://huggingface.co/MaziyarPanahi/WizardLM-Math-70B-TIES-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MaziyarPanahi__WizardLM-Math-70B-TIES-v0.1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-18T06:49:50.553009](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__WizardLM-Math-70B-TIES-v0.1/blob/main/results_2024-02-18T06-49-50.553009.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one under the "results" configuration and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6868282613819305,
"acc_stderr": 0.030371866427473967,
"acc_norm": 0.695311288530275,
"acc_norm_stderr": 0.030984285786669577,
"mc1": 0.36964504283965727,
"mc1_stderr": 0.01689818070697388,
"mc2": 0.5360987678643523,
"mc2_stderr": 0.014938153988985473
},
"harness|arc:challenge|25": {
"acc": 0.6424914675767918,
"acc_stderr": 0.014005494275916573,
"acc_norm": 0.6851535836177475,
"acc_norm_stderr": 0.01357265770308495
},
"harness|hellaswag|10": {
"acc": 0.6836287592113125,
"acc_stderr": 0.004641092001425294,
"acc_norm": 0.8686516630153356,
"acc_norm_stderr": 0.0033709059327855567
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8026315789473685,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.8026315789473685,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7471698113207547,
"acc_stderr": 0.026749899771241214,
"acc_norm": 0.7471698113207547,
"acc_norm_stderr": 0.026749899771241214
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8194444444444444,
"acc_stderr": 0.032166008088022675,
"acc_norm": 0.8194444444444444,
"acc_norm_stderr": 0.032166008088022675
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.047551296160629475,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.047551296160629475
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6893617021276596,
"acc_stderr": 0.03025123757921317,
"acc_norm": 0.6893617021276596,
"acc_norm_stderr": 0.03025123757921317
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.046151869625837026,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.046151869625837026
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.040703290137070705,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.040703290137070705
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.0255064816981382,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.0255064816981382
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8193548387096774,
"acc_stderr": 0.021886178567172527,
"acc_norm": 0.8193548387096774,
"acc_norm_stderr": 0.021886178567172527
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.03510766597959217,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.03510766597959217
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8121212121212121,
"acc_stderr": 0.03050193405942914,
"acc_norm": 0.8121212121212121,
"acc_norm_stderr": 0.03050193405942914
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.022390787638216773,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.022390787638216773
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.927461139896373,
"acc_stderr": 0.018718998520678185,
"acc_norm": 0.927461139896373,
"acc_norm_stderr": 0.018718998520678185
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7230769230769231,
"acc_stderr": 0.022688042352424994,
"acc_norm": 0.7230769230769231,
"acc_norm_stderr": 0.022688042352424994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948492,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8109243697478992,
"acc_stderr": 0.02543511943810537,
"acc_norm": 0.8109243697478992,
"acc_norm_stderr": 0.02543511943810537
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4304635761589404,
"acc_stderr": 0.04042809961395634,
"acc_norm": 0.4304635761589404,
"acc_norm_stderr": 0.04042809961395634
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8862385321100917,
"acc_stderr": 0.0136136148002328,
"acc_norm": 0.8862385321100917,
"acc_norm_stderr": 0.0136136148002328
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5879629629629629,
"acc_stderr": 0.03356787758160831,
"acc_norm": 0.5879629629629629,
"acc_norm_stderr": 0.03356787758160831
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.019398452135813895,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.019398452135813895
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8776371308016878,
"acc_stderr": 0.02133174182974679,
"acc_norm": 0.8776371308016878,
"acc_norm_stderr": 0.02133174182974679
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8161434977578476,
"acc_stderr": 0.025998379092356513,
"acc_norm": 0.8161434977578476,
"acc_norm_stderr": 0.025998379092356513
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8473282442748091,
"acc_stderr": 0.03154521672005472,
"acc_norm": 0.8473282442748091,
"acc_norm_stderr": 0.03154521672005472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8512396694214877,
"acc_stderr": 0.03248470083807194,
"acc_norm": 0.8512396694214877,
"acc_norm_stderr": 0.03248470083807194
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.803680981595092,
"acc_stderr": 0.031207970394709225,
"acc_norm": 0.803680981595092,
"acc_norm_stderr": 0.031207970394709225
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573975,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573975
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179333,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8620689655172413,
"acc_stderr": 0.012331009307795663,
"acc_norm": 0.8620689655172413,
"acc_norm_stderr": 0.012331009307795663
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7745664739884393,
"acc_stderr": 0.022497230190967558,
"acc_norm": 0.7745664739884393,
"acc_norm_stderr": 0.022497230190967558
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5452513966480447,
"acc_stderr": 0.016653875777523995,
"acc_norm": 0.5452513966480447,
"acc_norm_stderr": 0.016653875777523995
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7845659163987139,
"acc_stderr": 0.023350225475471442,
"acc_norm": 0.7845659163987139,
"acc_norm_stderr": 0.023350225475471442
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8364197530864198,
"acc_stderr": 0.02058146613825712,
"acc_norm": 0.8364197530864198,
"acc_norm_stderr": 0.02058146613825712
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5177304964539007,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.5177304964539007,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5658409387222947,
"acc_stderr": 0.012659033237067253,
"acc_norm": 0.5658409387222947,
"acc_norm_stderr": 0.012659033237067253
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7683823529411765,
"acc_stderr": 0.025626533803777562,
"acc_norm": 0.7683823529411765,
"acc_norm_stderr": 0.025626533803777562
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7532679738562091,
"acc_stderr": 0.0174408203674025,
"acc_norm": 0.7532679738562091,
"acc_norm_stderr": 0.0174408203674025
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940588,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940588
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8,
"acc_stderr": 0.02560737598657916,
"acc_norm": 0.8,
"acc_norm_stderr": 0.02560737598657916
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700643,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.0256432399976243,
"acc_norm": 0.93,
"acc_norm_stderr": 0.0256432399976243
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8654970760233918,
"acc_stderr": 0.026168221344662297,
"acc_norm": 0.8654970760233918,
"acc_norm_stderr": 0.026168221344662297
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36964504283965727,
"mc1_stderr": 0.01689818070697388,
"mc2": 0.5360987678643523,
"mc2_stderr": 0.014938153988985473
},
"harness|winogrande|5": {
"acc": 0.8271507498026835,
"acc_stderr": 0.010626964529971855
},
"harness|gsm8k|5": {
"acc": 0.27369219105382864,
"acc_stderr": 0.012281003490963456
}
}
```
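For illustration, the per-task accuracies in the JSON above can be averaged with a short standard-library snippet. Note this is a plain mean over a handful of sampled tasks, not necessarily the exact aggregation scheme the leaderboard uses:

```python
# A few per-task accuracies copied from the results JSON above.
task_acc = {
    "arc:challenge|25": 0.6424914675767918,
    "hellaswag|10": 0.6836287592113125,
    "winogrande|5": 0.8271507498026835,
    "gsm8k|5": 0.27369219105382864,
}

# Unweighted mean over the sampled tasks.
mean_acc = sum(task_acc.values()) / len(task_acc)
print(round(mean_acc, 4))  # → 0.6067
```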
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
jxu124/guesswhat | ---
license: apache-2.0
dataset_info:
features:
- name: image_raw
dtype: image
- name: status
dtype: string
- name: picture
struct:
- name: coco_url
dtype: string
- name: file_name
dtype: string
- name: flickr_url
dtype: string
- name: height
dtype: int64
- name: width
dtype: int64
- name: picture_id
dtype: int64
- name: qas
list:
- name: q
dtype: string
- name: a
dtype: string
- name: id
dtype: int64
- name: questioner_id
dtype: int64
- name: timestamp
dtype: string
- name: object_id
dtype: int64
- name: dialogue_id
dtype: int64
- name: objects
struct:
- name: objects_keys
sequence: string
- name: objects_values
list:
- name: area
dtype: float64
- name: bbox
sequence: float64
- name: category
dtype: string
- name: category_id
dtype: int64
- name: iscrowd
dtype: bool
- name: object_id
dtype: int64
- name: segment
sequence:
sequence: float64
splits:
- name: train
num_bytes: 17727639600.26
num_examples: 108860
- name: test
num_bytes: 3858218992.82
num_examples: 23115
- name: validation
num_bytes: 3885120224.34
num_examples: 23305
download_size: 25497584790
dataset_size: 25470978817.42
---
The original dataset can be accessed [here](https://github.com/GuessWhatGame/guesswhat). |
chronbmm/sanskrit-sandhi-split-hackathon | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: unsandhied
dtype: string
splits:
- name: train
num_bytes: 9350944
num_examples: 89323
- name: validation
num_bytes: 1164083
num_examples: 10235
- name: test
num_bytes: 1169683
num_examples: 9965
- name: test_500
num_bytes: 62539
num_examples: 500
- name: validation_500
num_bytes: 53738
num_examples: 500
download_size: 7114072
dataset_size: 11800987
---
# Dataset Card for "sanskrit-sandhi-split-hackathon"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
boopysaur/user1-raw-small | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1800075
num_examples: 24237
download_size: 1284193
dataset_size: 1800075
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zolak/twitter_dataset_78_1713163932 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 273910
num_examples: 662
download_size: 140741
dataset_size: 273910
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
IDEA-CCNL/Ziya-Visual-Eval-Chinese | ---
license: apache-2.0
language:
- zh
pretty_name: Ziya-Visual-Eval-Chinese
size_categories:
- n<1K
---
# 姜子牙-Visual中文评估数据集 Ziya-Visual-Eval-Chinese
### 数据介绍 Dataset Summary
数据集由[LLaVA](https://github.com/haotian-liu/LLaVA)评估集翻译而来,图片源来自coco数据集,用于评估多模态大模型的中文能力
This dataset was translated from the [LLaVA](https://github.com/haotian-liu/LLaVA) evaluation set, with images sourced from the COCO dataset. It is used to evaluate the Chinese-language capabilities of multimodal large models.
### 语言 Languages
中文
Chinese
### 数据示例 Data Instances
```json
{"question_id": 0, "image": "000000441147.jpg", "text": "图片中两个手提箱的颜色是什么?", "category": "conv"}
```
### 数据字段 Data Fields
- question_id: int
- image: str
- text: str
- category: str
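As a minimal sketch of working with this record layout, the sample instance shown above can be parsed with the Python standard library (only the example record from this card is used):

```python
import json

# Sample record copied from the "Data Instances" section above.
record = json.loads(
    '{"question_id": 0, "image": "000000441147.jpg", '
    '"text": "图片中两个手提箱的颜色是什么?", "category": "conv"}'
)

# Each record carries an integer question id, an image file name,
# the Chinese question text, and a category label.
print(record["question_id"], record["image"], record["category"])
```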
### 引用 Citation
```
@article{fengshenbang,
author = {Jiaxing Zhang and Ruyi Gan and Junjie Wang and Yuxiang Zhang and Lin Zhang and Ping Yang and Xinyu Gao and Ziwei Wu and Xiaoqun Dong and Junqing He and Jianheng Zhuo and Qi Yang and Yongfeng Huang and Xiayu Li and Yanghan Wu and Junyu Lu and Xinyu Zhu and Weifeng Chen and Ting Han and Kunhao Pan and Rui Wang and Hao Wang and Xiaojun Wu and Zhongshen Zeng and Chongpei Chen},
title = {Fengshenbang 1.0: Being the Foundation of Chinese Cognitive Intelligence},
journal = {CoRR},
volume = {abs/2209.02970},
year = {2022}
}
```
|
jbrinkma/pile-100k | ---
license: mit
dataset_info:
features:
- name: text
dtype: string
- name: meta
struct:
- name: pile_set_name
dtype: string
splits:
- name: train
num_bytes: 553878190
num_examples: 100000
download_size: 289953878
dataset_size: 553878190
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
m-a-p/COIG-CQIA | ---
configs:
- config_name: "chinese_traditional"
data_files:
- split: train
path: chinese_traditional/*
- config_name: "coig_pc"
data_files:
- split: train
path: coig_pc/*
- config_name: "exam"
data_files:
- split: train
path: exam/*
- config_name: "finance"
- config_name: "douban"
data_files:
- split: train
path: douban/*
- config_name: "finance"
data_files:
- split: train
path: finance/*
- config_name: "human_value"
data_files:
- split: train
path: human_value/*
- config_name: "logi_qa"
data_files:
- split: train
path: logi_qa/*
- config_name: "ruozhiba"
data_files:
- split: train
path: ruozhiba/*
- config_name: "segmentfault"
data_files:
- split: train
path: segmentfault/*
- config_name: "wiki"
data_files:
- split: train
path: wiki/*
- config_name: "wikihow"
data_files:
- split: train
path: wikihow/*
- config_name: "xhs"
data_files:
- split: train
path: xhs/*
- config_name: "zhihu"
data_files:
- split: train
path: zhihu/*
task_categories:
- question-answering
- text-classification
- text-generation
- text2text-generation
language:
- zh
size_categories:
- 10K<n<100K
---
<div align="center">
<img src="Yi_logo.svg" width="150px" style="display: inline-block;">
<img src="siat-logo.jpg" width="150px" style="display: inline-block;">
<img src="m-a-p.png" width="150px" style="display: inline-block;">
</div>
# COIG-CQIA: Quality is All You Need for Chinese Instruction Fine-tuning
<!-- Provide a quick summary of the dataset. -->
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
欢迎来到COIG-CQIA,COIG-CQIA全称为**Chinese Open Instruction Generalist - Quality is All You Need**, 是一个开源的高质量指令微调数据集,旨在为中文NLP社区提供**高质量**且符合**人类交互行为**的指令微调数据。COIG-CQIA以中文互联网获取到的问答及文章作为原始数据,经过深度清洗、重构及人工审核构建而成。本项目受*LIMA: Less Is More for Alignment*等研究启发,使用少量高质量的数据即可让大语言模型学习到人类交互行为,因此在数据构建中我们十分注重数据的来源、质量与多样性,数据集详情请见数据介绍以及我们接下来的论文。
Welcome to the COIG-CQIA project page. COIG-CQIA stands for **Chinese Open Instruction Generalist - Quality is All You Need**, a high-quality Chinese instruction fine-tuning dataset. This dataset is designed to provide the Chinese NLP community with **high-quality** and **human interaction-aligned** instruction fine-tuning data. Inspired by studies like *LIMA: Less Is More for Alignment*, COIG-CQIA focuses on creating a dataset from Chinese internet sources, including Q&A and articles. These are deeply cleansed, restructured, and manually reviewed to ensure quality, diversity, and relevance.
- **Curated by:** Researchers from 01.AI (零一万物), the Shenzhen Institute of Advanced Technology (Chinese Academy of Sciences), M-A-P, and other institutions.
- **Language(s) (NLP):** The primary language of this dataset is Chinese.
- **License:** [More Information Needed]
This dataset is currently at version v0.1. If you find any issues or see room for improvement while using it, feel free to leave a comment and discuss!
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
This dataset is intended for instruction fine-tuning, training models to acquire the ability to follow instructions.
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Data
### Data Format
```json
{
    "instruction": "An example question or instruction.",
    "input": "Supplementary context for the question or instruction.",
    "output": "The response to the input.",
    "task_type": {
        "major": ["问答"],
        "minor": ["百科问答"]
    },
    "domain": ["百科", "医疗"],
    "answer_from": "human",
    "human_verified": true,
    "copyright": "Author and copyright information."
}
```
### Data Fields
- `instruction`: the instruction or question used as input.
- `input`: supplementary content for the question or instruction.
- `output`: the answer corresponding to the input.
- `task_type`: the major and minor task types this record belongs to.
- `domain`: the domain(s) this record belongs to.
- `answer_from`: whether the answer was written by a human or a large language model; the vast majority of answers in this dataset were written by humans, with a small portion generated by large models (and human-verified).
- `human_verified`: whether this record has been verified by a human.
- `copyright`: copyright information for this record, including the author.
Some data fields in the current version are still incomplete; we will supplement them in the upcoming version.
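As a minimal sketch, a record in this format can be checked against the field layout described above (the record below and the `llm` value for `answer_from` are illustrative assumptions, not taken from the dataset):

```python
# A minimal sketch: validating one record against the field layout
# described above. The record below is illustrative, not from the dataset.
record = {
    "instruction": "解释什么是指令微调。",
    "input": "",
    "output": "指令微调是一种训练方法。",
    "task_type": {"major": ["问答"], "minor": ["百科问答"]},
    "domain": ["百科"],
    "answer_from": "human",
    "human_verified": True,
    "copyright": "",
}

REQUIRED_FIELDS = {
    "instruction", "input", "output", "task_type",
    "domain", "answer_from", "human_verified", "copyright",
}

def is_valid(rec: dict) -> bool:
    """Check the keys and a few type constraints from the schema above."""
    return (
        REQUIRED_FIELDS <= rec.keys()
        and isinstance(rec["task_type"], dict)
        and {"major", "minor"} <= rec["task_type"].keys()
        and isinstance(rec["domain"], list)
        # "llm" as the machine-answer label is an assumed value name.
        and rec["answer_from"] in {"human", "llm"}
        and isinstance(rec["human_verified"], bool)
    )

print(is_valid(record))
```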
### Data Details
<details>
<summary><b>Social Media & Forums</b></summary>
| Category | Count | Source | Construction Method |
| ----------------- | -------- | ------ | --------------------------------------- |
| Zhihu | 8837 | [[Link]](https://www.zhihu.com/) | Multi-stage data-quality filtering and human verification. |
| Douban | 3132 | [[Link]](https://www.douban.com/) | Constructed from diverse human-written prompt templates. |
| Xiaohongshu | 1508 | [[Link]](https://www.xiaohongshu.com/explore) | Constructed from diverse human-written prompt templates. |
| Segmentfault | 458 | [[Link]](https://segmentfault.com/) | Rule-based cleaning and filtering, with human verification. |
| **Total** | **13935** | - | - |
</details>
<details>
<summary><b>General Encyclopedia</b></summary>
| Category | Count | Source | Construction Method |
| ----------------- | -------- | ------ | --------------------------------------- |
| Encyclopedia articles | 980 | Collected from the web. [[Link]](https://10why.net/) [[Link]](https://www.eetree.cn/wiki/eebaike) [[Link]](https://www.nongyie.com/) [[Link]](https://www.gkket.com/gkwk/) | Rule-based cleaning and filtering, with human verification. |
| Encyclopedia of China | 1706 | [[Link]](https://www.zgbk.com/) | Constructed from diverse human-written prompt templates. |
| wikiHow (Chinese) | 1876 | [[Link]](https://zh.wikihow.com/首页)&[[Public dataset]](https://github.com/esbatmop/MNBVC/tree/main) | Rule-based cleaning and filtering. |
| **Total** | **4571** | - | - |
</details>
<details>
<summary><b>General NLP Tasks</b></summary>
| Category | Count | Source | Construction Method |
| ----------------- | -------- | ------ | --------------------------------------- |
| COIG-PC-Core | 3000 | [[Open Dataset]](https://huggingface.co/datasets/BAAI/COIG-PC-core) | Human-verified data quality. |
| **Total** | **3000** | - | - |
</details>
<details>
<summary><b>Exams & Test Questions</b></summary>
| Category | Count | Source | Construction Method |
| ----------------- | -------- | ------ | --------------------------------------- |
| Gaokao & Zhongkao entrance exams | 2000 | [[Public dataset]](https://huggingface.co/datasets/BAAI/COIG) | - |
| Graduate school entrance exams | 475 | Collected from the web | Rule-based cleaning and filtering. |
| Logical reasoning questions | 422 | Collected from the web | Rule-based cleaning and filtering. |
| **Total** | **2897** | - | - |
</details>
<details>
<summary><b>Human Values</b></summary>
| Category | Count | Source | Construction Method |
| ----------------- | -------- | ------ | --------------------------------------- |
| 100poison | 906 | [[Public dataset]](https://modelscope.cn/datasets/damo/100PoisonMpts/summary) | - |
| COIG-human-value | 101 | [[Public dataset]](https://huggingface.co/datasets/BAAI/COIG) | Human-reviewed data quality. |
| **Total** | **1007** | - | - |
</details>
<details>
<summary><b>Traditional Chinese Culture</b></summary>
| Category | Count | Source | Construction Method |
| ----------------- | -------- | ------ | --------------------------------------- |
| Traditional Chinese culture test questions | 232 | Collected from the web | Rule-based cleaning and filtering, with human verification. |
| Idiom definitions | 112 | [[Public dataset]](https://huggingface.co/datasets/YeungNLP/firefly-train-1.1M) | Rule-based cleaning and filtering, with human verification. |
| Classical poetry composition | 47 | [[Public dataset]](https://huggingface.co/datasets/YeungNLP/firefly-train-1.1M) | Rule-based cleaning and filtering, with human verification. |
| Classical Chinese translation | 112 | [[Public dataset]](https://huggingface.co/datasets/YeungNLP/firefly-train-1.1M) | Rule-based cleaning and filtering, with human verification. |
| **Total** | **503** | - | - |
</details>
<details>
<summary><b>Finance & Business Management</b></summary>
| Category | Count | Source | Construction Method |
| ----------------- | -------- | ------ | --------------------------------------- |
| MBA Encyclopedia | 10689 | [[Link]](https://wiki.mbalib.com/wiki/首页) | Constructed from diverse human-written prompt templates. |
| Finance NLP tasks | 600 | [[Public dataset]](https://huggingface.co/datasets/BAAI/COIG-PC) | Human-verified data quality. |
| **Total** | **11289** | - | - |
</details>
<details>
<summary><b>Medical Domain</b></summary>
| Category | Count | Source | Construction Method |
| ----------------- | -------- | ------ | --------------------------------------- |
| Medical encyclopedia | 8351 | [[Link]](www.baikemy.com) | Constructed from diverse human-written prompt templates. |
| Medical articles | 186 | [[Link]](https://51zyzy.com/article/list.html) [[Link]](https://baobao.baidu.com/dailyjnl/list/13.html) | Rule-based cleaning and filtering. |
| **Total** | **8537** | - | - |
</details>
<details>
<summary><b>Legal Domain</b></summary>
| Category | Count | Source | Construction Method |
| ----------------- | -------- | ------ | --------------------------------------- |
| Law graduate school entrance exams | 2645 | Collected from the web | Rule-based cleaning and filtering. |
| **Total** | **2645** | - | - |
</details>
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
If this project has been helpful for your research, please consider citing it!
```bibtex
@article{bai2024coig,
title={COIG-CQIA: Quality is All You Need for Chinese Instruction Fine-tuning},
author={Bai, Yuelin and Du, Xinrun and Liang, Yiming and Jin, Yonggang and Liu, Ziqiang and Zhou, Junting and Zheng, Tianyu and Zhang, Xincheng and Ma, Nuo and Wang, Zekun and others},
journal={arXiv preprint arXiv:2403.18058},
year={2024}
}
```
This dataset also incorporates the following public data:
```bibtex
@article{zhang2023chinese,
title={Chinese open instruction generalist: A preliminary release},
author={Zhang, Ge and Shi, Yemin and Liu, Ruibo and Yuan, Ruibin and Li, Yizhi and Dong, Siwei and Shu, Yu and Li, Zhaoqun and Wang, Zekun and Lin, Chenghua and others},
journal={arXiv preprint arXiv:2304.07987},
year={2023}
}
@misc{Firefly,
author = {Jianxin Yang},
title = {Firefly(流萤): 中文对话式大语言模型},
year = {2023},
publisher = {GitHub},
journal = {GitHub repository},
howpublished = {\url{https://github.com/yangjianxin1/Firefly}},
}
@misc{xu2023cvalues,
title={CValues: Measuring the Values of Chinese Large Language Models from Safety to Responsibility},
author={Guohai Xu and Jiayi Liu and Ming Yan and Haotian Xu and Jinghui Si and Zhuoran Zhou and Peng Yi and Xing Gao and Jitao Sang and Rong Zhang and Ji Zhang and Chao Peng and Fei Huang and Jingren Zhou},
year={2023},
eprint={2307.09705},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
liuyanchen1015/MULTI_VALUE_cola_regularized_reflexives | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 1802
num_examples: 27
- name: test
num_bytes: 1878
num_examples: 25
- name: train
num_bytes: 11199
num_examples: 154
download_size: 12397
dataset_size: 14879
---
# Dataset Card for "MULTI_VALUE_cola_regularized_reflexives"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
anirudh2403/therapy-conversation-synthetic | ---
license: openrail
---
|
DavidFM43/gutenberg_spacy-ner-monitoring | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-CARDINAL
'2': I-CARDINAL
'3': B-PERSON
'4': I-PERSON
'5': B-TIME
'6': I-TIME
'7': B-WORK_OF_ART
'8': I-WORK_OF_ART
splits:
- name: train
num_bytes: 1697
num_examples: 1
- name: test
num_bytes: 1531
num_examples: 1
download_size: 5147
dataset_size: 3228
---
# Dataset Card for "gutenberg_spacy-ner-monitoring"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zhangyi617/AE_adversarial_train_prompt5 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 21150300.0
num_examples: 50
download_size: 21150529
dataset_size: 21150300.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
MiguelAngeloCwb/dummy-issues-database | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: labels
list:
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: id
dtype: int64
- name: name
dtype: string
- name: node_id
dtype: string
- name: url
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: assignees
list:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: comments
dtype: int64
- name: created_at
dtype: string
- name: updated_at
dtype: string
- name: closed_at
dtype: string
- name: author_association
dtype: string
- name: active_lock_reason
dtype: 'null'
- name: draft
dtype: bool
- name: pull_request
struct:
- name: diff_url
dtype: string
- name: html_url
dtype: string
- name: merged_at
dtype: string
- name: patch_url
dtype: string
- name: url
dtype: string
- name: body
dtype: string
- name: reactions
struct:
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: confused
dtype: int64
- name: eyes
dtype: int64
- name: heart
dtype: int64
- name: hooray
dtype: int64
- name: laugh
dtype: int64
- name: rocket
dtype: int64
- name: total_count
dtype: int64
- name: url
dtype: string
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: 'null'
- name: state_reason
dtype: string
splits:
- name: train
num_bytes: 16036629
num_examples: 5609
download_size: 3927676
dataset_size: 16036629
---
# Dataset Card for "dummy-issues-database"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MNC-LLM/squad_subset_100_p_first | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: conversations
dtype: string
splits:
- name: train
num_bytes: 102948
num_examples: 100
download_size: 64854
dataset_size: 102948
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
myrtotsok/ben_requests_dataset | ---
dataset_info:
features:
- name: request
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 18548
num_examples: 240
- name: validation
num_bytes: 4632
num_examples: 60
download_size: 8245
dataset_size: 23180
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
version-control/tf-1.0-1.13-oss-seed | ---
dataset_info:
features:
- name: seed
dtype: string
- name: seed_api
dtype: string
- name: index
dtype: int64
splits:
- name: train
num_bytes: 11391694
num_examples: 14766
download_size: 4936606
dataset_size: 11391694
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-54000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1023293
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
truthisneverlinear/eleventh-doctor-scripts | ---
language: en
tags:
- NLP
- conversation
- dialogue
---
# Doctor Who Dialogues
This dataset contains all the script lines of the Eleventh Doctor from Doctor Who, a popular TV series. It can be processed and used for chatbots or related dialogue tasks. |
xzuyn/tulu-uncensored | ---
language:
- en
tags:
- allenai
- tulu
- ehartford
- alpaca
size_categories:
- 100K<n<1M
---
[How Far Can Camels Go? Exploring the State of Instruction Tuning on Open Resources](https://arxiv.org/abs/2306.04751)
[Original dataset page from ehartford.](https://huggingface.co/datasets/ehartford/open-instruct-uncensored)
348,020 entries. Sourced from `open-instruct-uncensored.jsonl`. It uses only these dataset subsets:
1. Flan V2
2. CoT
3. Dolly
4. OASST1
5. GPT4-Alpaca
6. Code-Alpaca
7. ShareGPT
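The per-subset counts below can be reproduced with a short sketch like this (the `dataset` field name on each JSONL record is an assumption for illustration):

```python
from collections import Counter

# Minimal sketch: tallying entries per source subset. The `dataset`
# field name on each record is an assumption for illustration.
entries = [
    {"dataset": "cot", "messages": []},
    {"dataset": "dolly", "messages": []},
    {"dataset": "cot", "messages": []},
]

counts = Counter(entry["dataset"] for entry in entries)
for name, count in counts.most_common():
    print(f"{name}: {count}")
```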
```
Count of each Dataset:
code_alpaca: 19991
oasst1: 49433
flan_v2: 97519
sharegpt: 46733
dolly: 14624
cot: 73946
gpt4_alpaca: 45774
``` |
kaitchup/opus-Finnish-to-English | ---
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: validation
num_bytes: 249219
num_examples: 2000
- name: train
num_bytes: 86453966
num_examples: 962383
download_size: 65522411
dataset_size: 86703185
---
# Dataset Card for "opus-fi-en"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
daeell/embedding-test | ---
license: mit
language:
- en
- ko
--- |
CyberHarem/aoba_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of aoba/青葉 (Kantai Collection)
This is the dataset of aoba/青葉 (Kantai Collection), containing 500 images and their tags.
The core tags of this character are `ponytail, scrunchie, blue_eyes, purple_hair, blue_scrunchie, pink_hair, messy_hair, short_hair, breasts, hair_scrunchie`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 457.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aoba_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 295.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aoba_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1175 | 634.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aoba_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 417.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aoba_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1175 | 834.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aoba_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/aoba_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, black_thighhighs, sailor_collar, serafuku, short_sleeves, solo, yellow_neckerchief, looking_at_viewer, shorts, simple_background, smile, white_background, ahoge, shirt, large_breasts |
| 1 | 36 |  |  |  |  |  | 1girl, serafuku, solo, simple_background, yellow_neckerchief, looking_at_viewer, upper_body, white_background, smile, purple_sailor_collar, short_sleeves, hair_ornament |
| 2 | 7 |  |  |  |  |  | 1girl, alternate_costume, full_body, looking_at_viewer, simple_background, sneakers, solo, standing, white_background, medium_breasts, black_shorts, grey_background, open_mouth, smile, t-shirt, white_shirt |
| 3 | 5 |  |  |  |  |  | 1girl, looking_at_viewer, simple_background, solo, white_background, blush, collarbone, blue_bikini, cleavage, hair_between_eyes, hair_ornament, large_breasts, medium_breasts, open_mouth, twitter_username, ahoge, front-tie_bikini_top, one-hour_drawing_challenge, side-tie_bikini_bottom, upper_body |
| 4 | 9 |  |  |  |  |  | 1girl, looking_at_viewer, blue_sky, day, outdoors, solo, medium_breasts, ocean, cleavage, cloud, beach, blue_bikini, large_breasts, navel, smile |
| 5 | 9 |  |  |  |  |  | fake_animal_ears, playboy_bunny, rabbit_ears, 1girl, black_leotard, rabbit_tail, solo, strapless_leotard, alternate_costume, detached_collar, fake_tail, looking_at_viewer, black_pantyhose, medium_breasts, black_bowtie, cleavage, cowboy_shot, large_breasts, simple_background, wrist_cuffs |
| 6 | 6 |  |  |  |  |  | 1girl, smile, solo, alternate_costume, floral_print, looking_at_viewer, hair_ornament, obi, upper_body, blue_kimono, new_year |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_thighhighs | sailor_collar | serafuku | short_sleeves | solo | yellow_neckerchief | looking_at_viewer | shorts | simple_background | smile | white_background | ahoge | shirt | large_breasts | upper_body | purple_sailor_collar | hair_ornament | alternate_costume | full_body | sneakers | standing | medium_breasts | black_shorts | grey_background | open_mouth | t-shirt | white_shirt | blush | collarbone | blue_bikini | cleavage | hair_between_eyes | twitter_username | front-tie_bikini_top | one-hour_drawing_challenge | side-tie_bikini_bottom | blue_sky | day | outdoors | ocean | cloud | beach | navel | fake_animal_ears | playboy_bunny | rabbit_ears | black_leotard | rabbit_tail | strapless_leotard | detached_collar | fake_tail | black_pantyhose | black_bowtie | cowboy_shot | wrist_cuffs | floral_print | obi | blue_kimono | new_year |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------------|:----------------|:-----------|:----------------|:-------|:---------------------|:--------------------|:---------|:--------------------|:--------|:-------------------|:--------|:--------|:----------------|:-------------|:-----------------------|:----------------|:--------------------|:------------|:-----------|:-----------|:-----------------|:---------------|:------------------|:-------------|:----------|:--------------|:--------|:-------------|:--------------|:-----------|:--------------------|:-------------------|:-----------------------|:-----------------------------|:-------------------------|:-----------|:------|:-----------|:--------|:--------|:--------|:--------|:-------------------|:----------------|:--------------|:----------------|:--------------|:--------------------|:------------------|:------------|:------------------|:---------------|:--------------|:--------------|:---------------|:------|:--------------|:-----------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 36 |  |  |  |  |  | X | | | X | X | X | X | X | | X | X | X | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | | | | | X | | X | | X | X | X | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | | | | X | | X | | X | | X | X | | X | X | | X | | | | | X | | | X | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | | | | | X | | X | | | X | | | | X | | | | | | | | X | | | | | | | | X | X | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 5 | 9 |  |  |  |  |  | X | | | | | X | | X | | X | | | | | X | | | | X | | | | X | | | | | | | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | |
| 6 | 6 |  |  |  |  |  | X | | | | | X | | X | | | X | | | | | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X |
|
open-llm-leaderboard/details_kalisai__Nusantara-1.8b-Indo-Chat | ---
pretty_name: Evaluation run of kalisai/Nusantara-1.8b-Indo-Chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [kalisai/Nusantara-1.8b-Indo-Chat](https://huggingface.co/kalisai/Nusantara-1.8b-Indo-Chat)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kalisai__Nusantara-1.8b-Indo-Chat\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T22:20:55.643139](https://huggingface.co/datasets/open-llm-leaderboard/details_kalisai__Nusantara-1.8b-Indo-Chat/blob/main/results_2024-03-10T22-20-55.643139.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3062721042670704,\n\
\ \"acc_stderr\": 0.032674268588055555,\n \"acc_norm\": 0.30890953699867585,\n\
\ \"acc_norm_stderr\": 0.0334717817480905,\n \"mc1\": 0.22766217870257038,\n\
\ \"mc1_stderr\": 0.014679255032111075,\n \"mc2\": 0.37265340146580295,\n\
\ \"mc2_stderr\": 0.013950530613032723\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3122866894197952,\n \"acc_stderr\": 0.013542598541688064,\n\
\ \"acc_norm\": 0.3532423208191126,\n \"acc_norm_stderr\": 0.013967822714840055\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4302927703644692,\n\
\ \"acc_stderr\": 0.004941051795214794,\n \"acc_norm\": 0.5632344154550887,\n\
\ \"acc_norm_stderr\": 0.004949716368890496\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34074074074074073,\n\
\ \"acc_stderr\": 0.04094376269996794,\n \"acc_norm\": 0.34074074074074073,\n\
\ \"acc_norm_stderr\": 0.04094376269996794\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.34210526315789475,\n \"acc_stderr\": 0.038607315993160904,\n\
\ \"acc_norm\": 0.34210526315789475,\n \"acc_norm_stderr\": 0.038607315993160904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.33,\n\
\ \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3320754716981132,\n \"acc_stderr\": 0.02898545565233439,\n\
\ \"acc_norm\": 0.3320754716981132,\n \"acc_norm_stderr\": 0.02898545565233439\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2916666666666667,\n\
\ \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.2916666666666667,\n\
\ \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n\
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2658959537572254,\n\
\ \"acc_stderr\": 0.033687629322594316,\n \"acc_norm\": 0.2658959537572254,\n\
\ \"acc_norm_stderr\": 0.033687629322594316\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2978723404255319,\n \"acc_stderr\": 0.02989614568209546,\n\
\ \"acc_norm\": 0.2978723404255319,\n \"acc_norm_stderr\": 0.02989614568209546\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21052631578947367,\n\
\ \"acc_stderr\": 0.038351539543994194,\n \"acc_norm\": 0.21052631578947367,\n\
\ \"acc_norm_stderr\": 0.038351539543994194\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3448275862068966,\n \"acc_stderr\": 0.03960933549451208,\n\
\ \"acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.03960933549451208\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2777777777777778,\n \"acc_stderr\": 0.02306818884826111,\n \"\
acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02306818884826111\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2709677419354839,\n\
\ \"acc_stderr\": 0.02528441611490016,\n \"acc_norm\": 0.2709677419354839,\n\
\ \"acc_norm_stderr\": 0.02528441611490016\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2512315270935961,\n \"acc_stderr\": 0.030516530732694436,\n\
\ \"acc_norm\": 0.2512315270935961,\n \"acc_norm_stderr\": 0.030516530732694436\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\
: 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3090909090909091,\n \"acc_stderr\": 0.036085410115739666,\n\
\ \"acc_norm\": 0.3090909090909091,\n \"acc_norm_stderr\": 0.036085410115739666\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3939393939393939,\n \"acc_stderr\": 0.03481285338232963,\n \"\
acc_norm\": 0.3939393939393939,\n \"acc_norm_stderr\": 0.03481285338232963\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.35233160621761656,\n \"acc_stderr\": 0.034474782864143565,\n\
\ \"acc_norm\": 0.35233160621761656,\n \"acc_norm_stderr\": 0.034474782864143565\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2692307692307692,\n \"acc_stderr\": 0.022489389793654824,\n\
\ \"acc_norm\": 0.2692307692307692,\n \"acc_norm_stderr\": 0.022489389793654824\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24444444444444444,\n \"acc_stderr\": 0.026202766534652148,\n \
\ \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.026202766534652148\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.02934457250063434,\n \
\ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.02934457250063434\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658753,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658753\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3486238532110092,\n \"acc_stderr\": 0.020431254090714328,\n \"\
acc_norm\": 0.3486238532110092,\n \"acc_norm_stderr\": 0.020431254090714328\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3148148148148148,\n \"acc_stderr\": 0.03167468706828979,\n \"\
acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.03167468706828979\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.38235294117647056,\n \"acc_stderr\": 0.03410785338904719,\n \"\
acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.03410785338904719\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3080168776371308,\n \"acc_stderr\": 0.03005238933560569,\n \
\ \"acc_norm\": 0.3080168776371308,\n \"acc_norm_stderr\": 0.03005238933560569\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3452914798206278,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.3452914798206278,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.039153454088478354,\n\
\ \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.039153454088478354\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.33884297520661155,\n \"acc_stderr\": 0.04320767807536671,\n \"\
acc_norm\": 0.33884297520661155,\n \"acc_norm_stderr\": 0.04320767807536671\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.24074074074074073,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.24074074074074073,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.26380368098159507,\n \"acc_stderr\": 0.03462419931615623,\n\
\ \"acc_norm\": 0.26380368098159507,\n \"acc_norm_stderr\": 0.03462419931615623\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.19642857142857142,\n\
\ \"acc_stderr\": 0.037709700493470194,\n \"acc_norm\": 0.19642857142857142,\n\
\ \"acc_norm_stderr\": 0.037709700493470194\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.39805825242718446,\n \"acc_stderr\": 0.04846748253977239,\n\
\ \"acc_norm\": 0.39805825242718446,\n \"acc_norm_stderr\": 0.04846748253977239\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.4017094017094017,\n\
\ \"acc_stderr\": 0.03211693751051621,\n \"acc_norm\": 0.4017094017094017,\n\
\ \"acc_norm_stderr\": 0.03211693751051621\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.3537675606641124,\n\
\ \"acc_stderr\": 0.017098184708161903,\n \"acc_norm\": 0.3537675606641124,\n\
\ \"acc_norm_stderr\": 0.017098184708161903\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.28034682080924855,\n \"acc_stderr\": 0.024182427496577612,\n\
\ \"acc_norm\": 0.28034682080924855,\n \"acc_norm_stderr\": 0.024182427496577612\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.264804469273743,\n\
\ \"acc_stderr\": 0.014756906483260659,\n \"acc_norm\": 0.264804469273743,\n\
\ \"acc_norm_stderr\": 0.014756906483260659\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2908496732026144,\n \"acc_stderr\": 0.026004800363952113,\n\
\ \"acc_norm\": 0.2908496732026144,\n \"acc_norm_stderr\": 0.026004800363952113\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.28938906752411575,\n\
\ \"acc_stderr\": 0.02575586592263294,\n \"acc_norm\": 0.28938906752411575,\n\
\ \"acc_norm_stderr\": 0.02575586592263294\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2808641975308642,\n \"acc_stderr\": 0.025006469755799208,\n\
\ \"acc_norm\": 0.2808641975308642,\n \"acc_norm_stderr\": 0.025006469755799208\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432424,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432424\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2516297262059974,\n\
\ \"acc_stderr\": 0.011083276280441914,\n \"acc_norm\": 0.2516297262059974,\n\
\ \"acc_norm_stderr\": 0.011083276280441914\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.029520095697687754,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.029520095697687754\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25980392156862747,\n \"acc_stderr\": 0.01774089950917779,\n \
\ \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.01774089950917779\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3181818181818182,\n\
\ \"acc_stderr\": 0.04461272175910508,\n \"acc_norm\": 0.3181818181818182,\n\
\ \"acc_norm_stderr\": 0.04461272175910508\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2693877551020408,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.2693877551020408,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.25870646766169153,\n\
\ \"acc_stderr\": 0.03096590312357304,\n \"acc_norm\": 0.25870646766169153,\n\
\ \"acc_norm_stderr\": 0.03096590312357304\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3253012048192771,\n\
\ \"acc_stderr\": 0.03647168523683229,\n \"acc_norm\": 0.3253012048192771,\n\
\ \"acc_norm_stderr\": 0.03647168523683229\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.32748538011695905,\n \"acc_stderr\": 0.035993357714560276,\n\
\ \"acc_norm\": 0.32748538011695905,\n \"acc_norm_stderr\": 0.035993357714560276\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22766217870257038,\n\
\ \"mc1_stderr\": 0.014679255032111075,\n \"mc2\": 0.37265340146580295,\n\
\ \"mc2_stderr\": 0.013950530613032723\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5974743488555643,\n \"acc_stderr\": 0.013782866831703043\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.03335860500379075,\n \
\ \"acc_stderr\": 0.004946282649173775\n }\n}\n```"
repo_url: https://huggingface.co/kalisai/Nusantara-1.8b-Indo-Chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|arc:challenge|25_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|gsm8k|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hellaswag|10_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T22-20-55.643139.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T22-20-55.643139.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- '**/details_harness|winogrande|5_2024-03-10T22-20-55.643139.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T22-20-55.643139.parquet'
- config_name: results
data_files:
- split: 2024_03_10T22_20_55.643139
path:
- results_2024-03-10T22-20-55.643139.parquet
- split: latest
path:
- results_2024-03-10T22-20-55.643139.parquet
---
# Dataset Card for Evaluation run of kalisai/Nusantara-1.8b-Indo-Chat
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [kalisai/Nusantara-1.8b-Indo-Chat](https://huggingface.co/kalisai/Nusantara-1.8b-Indo-Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kalisai__Nusantara-1.8b-Indo-Chat",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-10T22:20:55.643139](https://huggingface.co/datasets/open-llm-leaderboard/details_kalisai__Nusantara-1.8b-Indo-Chat/blob/main/results_2024-03-10T22-20-55.643139.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.3062721042670704,
"acc_stderr": 0.032674268588055555,
"acc_norm": 0.30890953699867585,
"acc_norm_stderr": 0.0334717817480905,
"mc1": 0.22766217870257038,
"mc1_stderr": 0.014679255032111075,
"mc2": 0.37265340146580295,
"mc2_stderr": 0.013950530613032723
},
"harness|arc:challenge|25": {
"acc": 0.3122866894197952,
"acc_stderr": 0.013542598541688064,
"acc_norm": 0.3532423208191126,
"acc_norm_stderr": 0.013967822714840055
},
"harness|hellaswag|10": {
"acc": 0.4302927703644692,
"acc_stderr": 0.004941051795214794,
"acc_norm": 0.5632344154550887,
"acc_norm_stderr": 0.004949716368890496
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.04094376269996794,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.04094376269996794
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.038607315993160904,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.038607315993160904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3320754716981132,
"acc_stderr": 0.02898545565233439,
"acc_norm": 0.3320754716981132,
"acc_norm_stderr": 0.02898545565233439
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2916666666666667,
"acc_stderr": 0.038009680605548594,
"acc_norm": 0.2916666666666667,
"acc_norm_stderr": 0.038009680605548594
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2658959537572254,
"acc_stderr": 0.033687629322594316,
"acc_norm": 0.2658959537572254,
"acc_norm_stderr": 0.033687629322594316
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2978723404255319,
"acc_stderr": 0.02989614568209546,
"acc_norm": 0.2978723404255319,
"acc_norm_stderr": 0.02989614568209546
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.038351539543994194,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.038351539543994194
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3448275862068966,
"acc_stderr": 0.03960933549451208,
"acc_norm": 0.3448275862068966,
"acc_norm_stderr": 0.03960933549451208
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02306818884826111,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02306818884826111
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2709677419354839,
"acc_stderr": 0.02528441611490016,
"acc_norm": 0.2709677419354839,
"acc_norm_stderr": 0.02528441611490016
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2512315270935961,
"acc_stderr": 0.030516530732694436,
"acc_norm": 0.2512315270935961,
"acc_norm_stderr": 0.030516530732694436
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3090909090909091,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.3090909090909091,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3939393939393939,
"acc_stderr": 0.03481285338232963,
"acc_norm": 0.3939393939393939,
"acc_norm_stderr": 0.03481285338232963
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.35233160621761656,
"acc_stderr": 0.034474782864143565,
"acc_norm": 0.35233160621761656,
"acc_norm_stderr": 0.034474782864143565
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2692307692307692,
"acc_stderr": 0.022489389793654824,
"acc_norm": 0.2692307692307692,
"acc_norm_stderr": 0.022489389793654824
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.026202766534652148,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.026202766534652148
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.02934457250063434,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.02934457250063434
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658753,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658753
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3486238532110092,
"acc_stderr": 0.020431254090714328,
"acc_norm": 0.3486238532110092,
"acc_norm_stderr": 0.020431254090714328
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.03167468706828979,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.03167468706828979
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.03410785338904719,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.03410785338904719
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3080168776371308,
"acc_stderr": 0.03005238933560569,
"acc_norm": 0.3080168776371308,
"acc_norm_stderr": 0.03005238933560569
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3452914798206278,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.3452914798206278,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2748091603053435,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.2748091603053435,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.33884297520661155,
"acc_stderr": 0.04320767807536671,
"acc_norm": 0.33884297520661155,
"acc_norm_stderr": 0.04320767807536671
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26380368098159507,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.26380368098159507,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.19642857142857142,
"acc_stderr": 0.037709700493470194,
"acc_norm": 0.19642857142857142,
"acc_norm_stderr": 0.037709700493470194
},
"harness|hendrycksTest-management|5": {
"acc": 0.39805825242718446,
"acc_stderr": 0.04846748253977239,
"acc_norm": 0.39805825242718446,
"acc_norm_stderr": 0.04846748253977239
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.4017094017094017,
"acc_stderr": 0.03211693751051621,
"acc_norm": 0.4017094017094017,
"acc_norm_stderr": 0.03211693751051621
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.3537675606641124,
"acc_stderr": 0.017098184708161903,
"acc_norm": 0.3537675606641124,
"acc_norm_stderr": 0.017098184708161903
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.28034682080924855,
"acc_stderr": 0.024182427496577612,
"acc_norm": 0.28034682080924855,
"acc_norm_stderr": 0.024182427496577612
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.264804469273743,
"acc_stderr": 0.014756906483260659,
"acc_norm": 0.264804469273743,
"acc_norm_stderr": 0.014756906483260659
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2908496732026144,
"acc_stderr": 0.026004800363952113,
"acc_norm": 0.2908496732026144,
"acc_norm_stderr": 0.026004800363952113
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.28938906752411575,
"acc_stderr": 0.02575586592263294,
"acc_norm": 0.28938906752411575,
"acc_norm_stderr": 0.02575586592263294
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2808641975308642,
"acc_stderr": 0.025006469755799208,
"acc_norm": 0.2808641975308642,
"acc_norm_stderr": 0.025006469755799208
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432424,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432424
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2516297262059974,
"acc_stderr": 0.011083276280441914,
"acc_norm": 0.2516297262059974,
"acc_norm_stderr": 0.011083276280441914
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.029520095697687754,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.029520095697687754
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.01774089950917779,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.01774089950917779
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3181818181818182,
"acc_stderr": 0.04461272175910508,
"acc_norm": 0.3181818181818182,
"acc_norm_stderr": 0.04461272175910508
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2693877551020408,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.2693877551020408,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.25870646766169153,
"acc_stderr": 0.03096590312357304,
"acc_norm": 0.25870646766169153,
"acc_norm_stderr": 0.03096590312357304
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3253012048192771,
"acc_stderr": 0.03647168523683229,
"acc_norm": 0.3253012048192771,
"acc_norm_stderr": 0.03647168523683229
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.32748538011695905,
"acc_stderr": 0.035993357714560276,
"acc_norm": 0.32748538011695905,
"acc_norm_stderr": 0.035993357714560276
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22766217870257038,
"mc1_stderr": 0.014679255032111075,
"mc2": 0.37265340146580295,
"mc2_stderr": 0.013950530613032723
},
"harness|winogrande|5": {
"acc": 0.5974743488555643,
"acc_stderr": 0.013782866831703043
},
"harness|gsm8k|5": {
"acc": 0.03335860500379075,
"acc_stderr": 0.004946282649173775
}
}
```
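Every per-task entry in the JSON above shares the same shape, so the accuracies can be macro-averaged with a short helper. This is an illustrative sketch, not part of the evaluation pipeline: `macro_avg_acc` and the truncated `sample` dict are hypothetical names, and `sample` copies only two of the task entries shown above.

```python
# Sketch: macro-average per-task "acc" values from a results dict like the one above.
# Skips the precomputed "all" block so only individual tasks are averaged.

def macro_avg_acc(results: dict) -> float:
    """Average "acc" over per-task entries, ignoring the aggregated "all" entry."""
    accs = [v["acc"] for k, v in results.items() if k != "all" and "acc" in v]
    return sum(accs) / len(accs)

# Truncated sample copied from the results above (illustrative only).
sample = {
    "all": {"acc": 0.3062721042670704},
    "harness|arc:challenge|25": {"acc": 0.3122866894197952},
    "harness|hellaswag|10": {"acc": 0.4302927703644692},
}
print(round(macro_avg_acc(sample), 4))  # 0.3713 for these two tasks
```

Note that the real `"all"` entry is computed over all 57+ tasks, so it will not match a macro-average over a truncated sample like this one.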
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
czyzi0/pwr-azon-speech-dataset | ---
license: cc-by-sa-4.0
task_categories:
- automatic-speech-recognition
language:
- pl
pretty_name: PWr AZON
size_categories:
- 10K<n<100K
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 44100
- name: transcript
dtype: string
- name: gender
dtype: string
- name: id
dtype: string
- name: id_og
dtype: string
splits:
- name: train
num_bytes: 8585221408.406
num_examples: 14491
- name: unsup
num_bytes: 1128648882
num_examples: 841
download_size: 9746452069
dataset_size: 9713870290.406
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: unsup
path: data/unsup-*
---
This speech dataset consists of 15332 short audio clips of multiple speakers speaking Polish. Transcripts are provided for 14491 audio clips (`train` split) and are missing for 841 audio clips (`unsup` split). The speaker's gender is provided for the whole dataset. The clips have a total length of almost 31 hours.
This dataset was created from _Korpus nagrań próbek mowy do celów budowy modeli akustycznych dla automatycznego rozpoznawania mowy w języku polskim_ and repackaged into an easier-to-use format. If you are interested in the original data, please visit https://zasobynauki.pl/zasoby/korpus-nagran-probek-mowy-do-celow-budowy-modeli-akustycznych-dla-automatycznego-rozpoznawania-mowy,53293/
Also, if you find this resource helpful, kindly consider leaving a like. |
CyberHarem/suzunami_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of suzunami (Kantai Collection)
This is the dataset of suzunami (Kantai Collection), containing 74 images and their tags.
The core tags of this character are `green_hair, ribbon, hair_ribbon, multicolored_hair, bow, gradient_hair, medium_hair, aqua_bow, breasts, long_hair, brown_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 74 | 63.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/suzunami_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 74 | 40.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/suzunami_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 150 | 82.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/suzunami_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 74 | 56.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/suzunami_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 150 | 109.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/suzunami_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/suzunami_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 18 |  |  |  |  |  | 1girl, navel, solo, bikini, forehead, short_shorts, looking_at_viewer, open_mouth, smile, white_jacket, cleavage, collarbone, cowboy_shot, open_jacket, simple_background, black_shorts, medium_breasts, white_background, official_alternate_costume |
| 1 | 5 |  |  |  |  |  | 1girl, forehead, grey_pantyhose, halterneck, long_sleeves, pleated_dress, school_uniform, simple_background, solo, white_shirt, full_body, lace-up_boots, purple_dress, white_background, open_mouth, short_hair, standing, aqua_bowtie, blue_bowtie, chibi, grey_hair, smile |
| 2 | 11 |  |  |  |  |  | 1girl, forehead, halterneck, pleated_dress, purple_dress, school_uniform, white_shirt, long_sleeves, solo, grey_pantyhose, polka_dot_ribbon, one-hour_drawing_challenge, open_mouth, white_ribbon, aqua_bowtie, white_background, cowboy_shot, looking_at_viewer, smile, half_updo, simple_background |
| 3 | 6 |  |  |  |  |  | fake_animal_ears, playboy_bunny, rabbit_ears, rabbit_tail, strapless_leotard, wrist_cuffs, 1girl, detached_collar, forehead, grey_pantyhose, purple_leotard, solo, aqua_bowtie, fake_tail, fishnet_pantyhose, full_body, highleg_leotard, simple_background, small_breasts, thighband_pantyhose, white_background, adapted_costume, high_heels, medium_breasts, purple_footwear |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | navel | solo | bikini | forehead | short_shorts | looking_at_viewer | open_mouth | smile | white_jacket | cleavage | collarbone | cowboy_shot | open_jacket | simple_background | black_shorts | medium_breasts | white_background | official_alternate_costume | grey_pantyhose | halterneck | long_sleeves | pleated_dress | school_uniform | white_shirt | full_body | lace-up_boots | purple_dress | short_hair | standing | aqua_bowtie | blue_bowtie | chibi | grey_hair | polka_dot_ribbon | one-hour_drawing_challenge | white_ribbon | half_updo | fake_animal_ears | playboy_bunny | rabbit_ears | rabbit_tail | strapless_leotard | wrist_cuffs | detached_collar | purple_leotard | fake_tail | fishnet_pantyhose | highleg_leotard | small_breasts | thighband_pantyhose | adapted_costume | high_heels | purple_footwear |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:---------|:-----------|:---------------|:--------------------|:-------------|:--------|:---------------|:-----------|:-------------|:--------------|:--------------|:--------------------|:---------------|:-----------------|:-------------------|:-----------------------------|:-----------------|:-------------|:---------------|:----------------|:-----------------|:--------------|:------------|:----------------|:---------------|:-------------|:-----------|:--------------|:--------------|:--------|:------------|:-------------------|:-----------------------------|:---------------|:------------|:-------------------|:----------------|:--------------|:--------------|:--------------------|:--------------|:------------------|:-----------------|:------------|:--------------------|:------------------|:----------------|:----------------------|:------------------|:-------------|:------------------|
| 0 | 18 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | | X | | X | | | X | X | | | | | | X | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 2 | 11 |  |  |  |  |  | X | | X | | X | | X | X | X | | | | X | | X | | | X | | X | X | X | X | X | X | | | X | | | X | | | | X | X | X | X | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | | X | | X | | | | | | | | | | X | | X | X | | X | | | | | | X | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
DBQ/Burberry.Product.prices.Japan | ---
annotations_creators:
- other
language_creators:
- other
language:
- en
license:
- unknown
multilinguality:
- monolingual
source_datasets:
- original
task_categories:
- text-classification
- image-classification
- feature-extraction
- image-segmentation
- image-to-image
- image-to-text
- object-detection
- summarization
- zero-shot-image-classification
pretty_name: Japan - Burberry - Product-level price list
tags:
- webscraping
- ecommerce
- Burberry
- fashion
- fashion product
- image
- fashion image
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: website_name
dtype: string
- name: competence_date
dtype: string
- name: country_code
dtype: string
- name: currency_code
dtype: string
- name: brand
dtype: string
- name: category1_code
dtype: string
- name: category2_code
dtype: string
- name: category3_code
dtype: string
- name: product_code
dtype: int64
- name: title
dtype: string
- name: itemurl
dtype: string
- name: imageurl
dtype: string
- name: full_price
dtype: float64
- name: price
dtype: float64
- name: full_price_eur
dtype: float64
- name: price_eur
dtype: float64
- name: flg_discount
dtype: int64
splits:
- name: train
num_bytes: 985467
num_examples: 2950
download_size: 267744
dataset_size: 985467
---
# Burberry web scraped data
## About the website
The **Fashion Industry** in the **Asia Pacific**, particularly in **Japan**, has seen significant growth in recent years. High-end luxury brands like **Burberry** have established themselves firmly in the region. It's a fast-paced, highly consumer-driven industry that heavily incorporates the latest technology and trends. One noteworthy trend in Japan's fashion industry is the rapid expansion of **Ecommerce platforms**. This dataset provides valuable insights into the **Ecommerce product-list page (PLP) data** of Burberry's operations in Japan, highlighting the online shopping preferences and buying behaviors of consumers in this unique and highly evolved marketplace.
## Link to **dataset**
[Japan - Burberry - Product-level price list dataset](https://www.databoutique.com/buy-data-page/Burberry%20Product-prices%20Japan/r/recxtv3fyaKGgEGOj)
|
tyzhu/squad_first_sent_v4_train_30_eval_10 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 111024
num_examples: 70
- name: validation
num_bytes: 11592
num_examples: 10
- name: eval_first_sent
num_bytes: 11592
num_examples: 10
download_size: 102146
dataset_size: 134208
---
# Dataset Card for "squad_first_sent_v4_train_30_eval_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tmnam20/test-dedup | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 395
num_examples: 4
download_size: 0
dataset_size: 395
---
# Dataset Card for "test-dedup"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FischlVonLuftschlossNarfidort/sample-genshin-character | ---
license: unknown
---
|
arjun2183/train-1k | ---
dataset_info:
features:
- name: Context
dtype: string
- name: Response
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1296987
num_examples: 1000
download_size: 652915
dataset_size: 1296987
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hlt-lab/mutualsample-repeat_last_speaker | ---
dataset_info:
features:
- name: context
dtype: string
- name: response
dtype: string
- name: reference
dtype: string
splits:
- name: train
num_bytes: 49856
num_examples: 100
download_size: 38655
dataset_size: 49856
---
# Dataset Card for "mutualsample-repeat_last_speaker"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Asap7772/relabeled_alpacafarm_pythiasft_20K_preference_data_maxlength | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: output
dtype: string
- name: text
dtype: string
- name: alpaca_text
dtype: string
- name: prompt
dtype: string
- name: alpaca_prompt
dtype: string
- name: y_ref
dtype: string
- name: y_1
dtype: string
- name: y_2
dtype: string
- name: y_w
dtype: string
- name: y_w_alpaca
dtype: string
- name: y_l
dtype: string
- name: y_l_alpaca
dtype: string
- name: y_w_score
dtype: float64
- name: y_l_score
dtype: float64
- name: score_diff
dtype: float64
splits:
- name: train
num_bytes: 177945579
num_examples: 19000
- name: test
num_bytes: 9378616
num_examples: 1000
download_size: 86089134
dataset_size: 187324195
---
# Dataset Card for "relabeled_alpacafarm_pythiasft_20K_preference_data_maxlength"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sensationalspace/sensarisk | ---
license: mit
---
|
Nikutka/L1_poleval_korpus_pelny | ---
dataset_info:
features:
- name: content
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 764265
num_examples: 9443
- name: test
num_bytes: 71297
num_examples: 891
download_size: 556613
dataset_size: 835562
---
# Dataset Card for "L1_poleval_korpus_pelny"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jotschi/visual_genome-simple-en | ---
language:
- en
license_name: cc-by-4.0
license_link: https://creativecommons.org/licenses/by/4.0/legalcode
tags:
- visual_genome
- simple-english
annotations_creators:
- machine-generated
pretty_name: Visual Genome in Simple English
size_categories:
- n<820k
source_datasets:
- visual_genome
task_categories:
- text-generation
- image-to-text
- text-to-image
---
# Dataset Card for Visual Genome Annotations in Simple English
This dataset contains captions that were rephrased into Simple English so that a young child could understand them.
## Dataset Details
### Dataset Description
- **Curated by:** [More Information Needed]
- **Language(s) (NLP):** en
- **License:** cc-by-4.0
### Dataset Sources
The processed [Visual Genome](https://homes.cs.washington.edu/~ranjay/visualgenome/index.html) captions in this repo are based on the following sources:
* 941425b651f50cdb1a6f0673eaab6260 vg_caption.json (https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/visual_genome/vg_caption.json)
Visual Genome:
- **Download:** https://homes.cs.washington.edu/~ranjay/visualgenome/index.html
- **Paper:** https://link.springer.com/article/10.1007/s11263-016-0981-7
## Dataset Creation
This dataset was generated by processing the annotations via [Mistral 7B](https://huggingface.co/TheBloke/Mistral-7B-Instruct-v0.2-AWQ).
Prompt used:
```
Rewrite the sentence " + caption + " for a 3 to 4 year old child. Give only one simple sentence. Don't use the word see. Give only a single answer.
```
A filter was applied to store only captions that matched the common output format. A best-effort filter was also applied to reduce the chance of including multiple example sentences in the output.
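The prompt construction and the output filtering described above can be sketched as follows. This is a minimal illustration only: the function names and the exact filter heuristic are assumptions, not the authors' actual code.

```python
# Hypothetical sketch of the generation pipeline: build the instruction
# prompt for one caption, then keep only outputs that look like a single
# simple sentence. The filter heuristic here is an assumption.

def build_prompt(caption: str) -> str:
    """Wrap a Visual Genome caption in the rewrite instruction."""
    return (
        'Rewrite the sentence "' + caption + '" for a 3 to 4 year old child. '
        "Give only one simple sentence. Don't use the word see. "
        "Give only a single answer."
    )

def keep_output(text: str) -> bool:
    """Best-effort check that the model returned exactly one sentence."""
    text = text.strip()
    # Require a single terminal period (rejects multi-sentence answers).
    if text.count(".") != 1 or not text.endswith("."):
        return False
    # Reject enumerated answers ("1. ...", "- ...").
    if text.startswith(("1.", "2.", "-", "*")):
        return False
    return True
```

A rewritten caption would then be stored only when `keep_output` accepts it.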
### Curation Rationale
This dataset is useful for experiments with small LLMs which have only a reduced corpus. The dataset is suitable to be used for LAVIS experiments (QFormer Training) with a finetuned TinyStories 33M LLM.
|
farsi_news | ---
annotations_creators:
- found
language_creators:
- found
language:
- fa
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-generation
- fill-mask
task_ids:
- language-modeling
- masked-language-modeling
paperswithcode_id: null
pretty_name: FarsiNews
dataset_info:
features:
- name: title
dtype: string
- name: summary
dtype: string
- name: link
dtype: string
- name: tags
sequence: string
splits:
- name: hamshahri
num_bytes: 1267659
num_examples: 2203
- name: radiofarda
num_bytes: 265272
num_examples: 284
download_size: 1648337
dataset_size: 1532931
---
# Dataset Card for FarsiNews
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** []()
- **Repository:** [link](https://github.com/sci2lab/Farsi-datasets)
- **Paper:** []()
- **Leaderboard:** []()
- **Point of Contact:** []()
### Dataset Summary
https://github.com/sci2lab/Farsi-datasets
Contains Farsi (Persian) datasets for Machine Learning tasks, particularly NLP.
These datasets have been extracted from the RSS feeds of two Farsi news agency websites:
- Hamshahri
- RadioFarda
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
[More Information Needed]
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
[More Information Needed]
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
https://github.com/sci2lab/Farsi-datasets
### Contributions
Thanks to [@Narsil](https://github.com/Narsil) for adding this dataset. |
Mahziar/Meta-Movies | ---
license: mit
---
|
arbml/wiki_lingua_ar | ---
dataset_info:
features:
- name: article
dtype: string
- name: summary
dtype: string
splits:
- name: test
num_bytes: 22744300
num_examples: 5841
- name: train
num_bytes: 79113081
num_examples: 20441
- name: validation
num_bytes: 11620265
num_examples: 2919
download_size: 55826192
dataset_size: 113477646
---
# Dataset Card for "wiki_lingua_ar"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Tsuinzues/maui | ---
license: openrail
---
|
Eloquent/TopicalQuiz | ---
license: cc-by-sa-4.0
language:
- en
pretty_name: ELOQUENT Topical Quiz task items
---
These datasets are the sample and test items for the 2024 ELOQUENT lab for evaluating the quality of generative language models. More information is available on the lab page at https://eloquent-lab.github.io/ |
arieg/bw_spec_cls_80_32 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '72786'
'1': '72787'
'2': '72788'
'3': '72789'
'4': '72790'
'5': '72926'
'6': '72927'
'7': '72928'
'8': '72930'
'9': '73099'
'10': '73100'
'11': '73123'
'12': '73124'
'13': '73125'
'14': '73169'
'15': '73170'
'16': '73171'
'17': '73172'
'18': '73174'
'19': '73175'
'20': '73192'
'21': '73193'
'22': '73306'
'23': '73309'
'24': '73318'
'25': '73335'
'26': '73340'
'27': '73341'
'28': '73342'
'29': '73343'
'30': '73344'
'31': '73363'
'32': '73365'
'33': '73366'
'34': '73367'
'35': '73368'
'36': '73369'
'37': '73370'
'38': '73371'
'39': '73372'
'40': '73465'
'41': '73466'
'42': '73467'
'43': '73468'
'44': '73469'
'45': '73486'
'46': '73495'
'47': '73550'
'48': '73551'
'49': '73566'
'50': '73568'
'51': '73572'
'52': '73573'
'53': '73580'
'54': '73584'
'55': '73585'
'56': '73587'
'57': '73658'
'58': '73675'
'59': '73760'
'60': '73761'
'61': '73762'
'62': '73764'
'63': '73765'
'64': '73766'
'65': '73767'
'66': '73768'
'67': '73769'
'68': '73770'
'69': '73771'
'70': '73772'
'71': '73774'
'72': '73778'
'73': '73792'
'74': '73797'
'75': '73819'
'76': '73820'
'77': '73821'
'78': '73822'
'79': '73921'
splits:
- name: train
num_bytes: 85147582.4
num_examples: 1600
- name: test
num_bytes: 21417107.0
num_examples: 400
download_size: 107224330
dataset_size: 106564689.4
---
# Dataset Card for "bw_spec_cls_80_32"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HamAndCheese82/mathocr-v2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: material_type
dtype: string
- name: latex
sequence: string
splits:
- name: train
num_bytes: 5475569186.412
num_examples: 237811
- name: validation
num_bytes: 234431735.696
num_examples: 20873
- name: test
num_bytes: 192718790.489
num_examples: 17369
download_size: 5401809531
dataset_size: 5902719712.597
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|