datasetId stringlengths 2 117 | card stringlengths 19 1.01M |
|---|---|
olm/olm-wikipedia-20220920 | ---
annotations_creators:
- no-annotation
language:
- en
language_creators:
- found
license: []
multilinguality:
- monolingual
pretty_name: OLM September 2022 Wikipedia
size_categories:
- 1M<n<10M
source_datasets: []
tags:
- pretraining
- language modelling
- wikipedia
- web
task_categories: []
task_ids: []
---
# Dataset Card for OLM September 2022 Wikipedia
Pretraining dataset, created with the OLM repo [here](https://github.com/huggingface/olm-datasets) from a September 2022 Wikipedia snapshot. |
mtkinit/MT-sentiment-dataset | ---
pretty_name: MT-sentiment-dataset
---
# MT-sentiment-dataset
Created from AIOD platform |
fathyshalab/rk1 | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 544223.8520625889
num_examples: 1124
- name: test
num_bytes: 136540.1479374111
num_examples: 282
download_size: 420690
dataset_size: 680764.0
---
# Dataset Card for "rk1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/11352_Videos_Surveillance_Video_Data_from_Overhead_View | ---
license: cc-by-nc-nd-4.0
---
## Description
11,352 Videos- Surveillance Video Data from Overhead View. The data diversity includes multiple scenes, different time periods. The data can be used for tasks such as human detection, human tracking, and human body attribute analysis and other computer vision task based on surveillance video.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1153?source=Huggingface
# Specifications
## Data size
11,352 videos, the total duration is 44 hours 1 minutes 10.08 seconds
## Collecting environment
indoor scenes
## Data diversity
multiple scenes, different time periods
## Device
surveillance cameras with resolution of 1,920*1,080
## Collecting angles
looking down angle from overhead view
## Data format
.mp4
## Collecting content
collecting videos in different scenes
# Licensing Information
Commercial License
|
guyhadad01/Judaism-Hebrew-tok | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 1449785076
num_examples: 2809661
download_size: 762459709
dataset_size: 1449785076
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tyzhu/lmind_nq_train1000_eval500_v1_doc | ---
configs:
- config_name: default
data_files:
- split: train_qa
path: data/train_qa-*
- split: train_recite_qa
path: data/train_recite_qa-*
- split: eval_qa
path: data/eval_qa-*
- split: eval_recite_qa
path: data/eval_recite_qa-*
- split: all_docs
path: data/all_docs-*
- split: all_docs_eval
path: data/all_docs_eval-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: answers
struct:
- name: answer_start
sequence: 'null'
- name: text
sequence: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train_qa
num_bytes: 115608
num_examples: 1000
- name: train_recite_qa
num_bytes: 755758
num_examples: 1000
- name: eval_qa
num_bytes: 58285
num_examples: 500
- name: eval_recite_qa
num_bytes: 377880
num_examples: 500
- name: all_docs
num_bytes: 950316
num_examples: 1462
- name: all_docs_eval
num_bytes: 950216
num_examples: 1462
- name: train
num_bytes: 950316
num_examples: 1462
- name: validation
num_bytes: 950316
num_examples: 1462
download_size: 3216664
dataset_size: 5108695
---
# Dataset Card for "lmind_nq_train1000_eval500_v1_doc"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
A-Bar/vi-ar_top_cs_dev | ---
dataset_info:
features:
- name: query
dtype: string
- name: passage
dtype: string
- name: label
dtype: float64
splits:
- name: train
num_bytes: 55656630
num_examples: 100000
download_size: 19350377
dataset_size: 55656630
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "vi-ar_top_cs_dev"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AndyLiu0104/Soldering-Data-Tiny-More-Data | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 17693780.125
num_examples: 10463
download_size: 11511188
dataset_size: 17693780.125
---
# Dataset Card for "Soldering-Data-Tiny-More-Data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_sst2_that_resultative_past_participle | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: train
num_bytes: 399
num_examples: 3
download_size: 2435
dataset_size: 399
---
# Dataset Card for "MULTI_VALUE_sst2_that_resultative_past_participle"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
duyhngoc/OV_Text | ---
annotations_creators:
- no-annotation
language:
- vi
license:
- apache-2.0
multilinguality:
- monolingual
pretty_name: OV_Text
size_categories:
- 10K<n<100K
task_categories:
- text-generation
---
# Dataset Card for OV_Text
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
The OV_Text dataset is a collection of 100,000 sentences sourced from various news articles.
Out of the 10,000 sentences in the dataset, 5,000 sentences have a length ranging from 50 to 150, while the other 5,000 sentences have a length ranging from 20 to 50. This distribution of sentence lengths provides a diverse range of text samples that can be used to train and test natural language processing models.
### Dataset Summary
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
| name | train | validation | test |
|---------|--------:|-----------:|-------:|
| small | 1600 | 200 | 200 |
| base | 8000 | 1000 | 1000 |
| large | 95000 | 2500 | 2500 |
## Dataset Creation
### Curation Rationale
### Source Data
### Annotations
## Additional Information
### Licensing Information
The dataset is released under Apache 2.0.
### Citation Information
### Contributions
|
presencesw/dataset_2000_decompese_question_2 | ---
dataset_info:
features:
- name: entities
sequence: 'null'
- name: triplets
list:
- name: question
dtype: string
- name: answer
dtype: string
- name: complex_question
dtype: string
splits:
- name: train
num_bytes: 69178
num_examples: 199
download_size: 26387
dataset_size: 69178
---
# Dataset Card for "dataset_2000_decompese_question_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/shiranui_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of shiranui/不知火/不知火 (Azur Lane)
This is the dataset of shiranui/不知火/不知火 (Azur Lane), containing 62 images and their tags.
The core tags of this character are `animal_ears, black_hair, bangs, red_eyes, hair_over_one_eye, rabbit_ears, short_hair, diagonal_bangs, half_updo, fake_animal_ears, hair_ornament, ribbon, blunt_bangs, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 62 | 43.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shiranui_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 62 | 33.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shiranui_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 124 | 62.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shiranui_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 62 | 40.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shiranui_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 124 | 74.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shiranui_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/shiranui_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 16 |  |  |  |  |  | 1girl, black_kimono, long_sleeves, solo, looking_at_viewer, upper_body, obiage, obijime, red_choker, wide_sleeves, blush, jitome, simple_background, empty_eyes, closed_mouth, collarbone, hitodama, open_mouth, white_background, retrofit_(azur_lane) |
| 1 | 6 |  |  |  |  |  | black_kimono, blush, closed_mouth, full_body, long_sleeves, tabi, wide_sleeves, 1girl, obiage, obijime, platform_footwear, simple_background, solo, white_background, expressionless, flame_print, geta, looking_at_viewer, red_choker, rigging, sitting, turret, eyes_visible_through_hair, hairband, hitodama, machinery, white_socks |
| 2 | 5 |  |  |  |  |  | 1girl, long_sleeves, looking_at_viewer, solo, wide_sleeves, black_kimono, simple_background, blush, brown_hair, hitodama, holding, upper_body, white_background, hair_bow, hairclip, machinery, red_choker, turret |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_kimono | long_sleeves | solo | looking_at_viewer | upper_body | obiage | obijime | red_choker | wide_sleeves | blush | jitome | simple_background | empty_eyes | closed_mouth | collarbone | hitodama | open_mouth | white_background | retrofit_(azur_lane) | full_body | tabi | platform_footwear | expressionless | flame_print | geta | rigging | sitting | turret | eyes_visible_through_hair | hairband | machinery | white_socks | brown_hair | holding | hair_bow | hairclip |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:---------------|:-------|:--------------------|:-------------|:---------|:----------|:-------------|:---------------|:--------|:---------|:--------------------|:-------------|:---------------|:-------------|:-----------|:-------------|:-------------------|:-----------------------|:------------|:-------|:--------------------|:-----------------|:--------------|:-------|:----------|:----------|:---------|:----------------------------|:-----------|:------------|:--------------|:-------------|:----------|:-----------|:-----------|
| 0 | 16 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | | X | X | X | X | X | | X | | X | | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | X | X | | | X | X | X | | X | | | | X | | X | | | | | | | | | | X | | | X | | X | X | X | X |
|
chrxiao/legal_ambiguity_identification | ---
license: cc-by-nc-sa-4.0
language:
- en
tags:
- legal
size_categories:
- n<1K
configs:
- config_name: sara
data_files: sara_annotated.csv
- config_name: echr
data_files: echr_annotated.csv
---
## Dataset Summary
This is a dataset for the novel legal ambiguity identification task,
adapting prior [SARA](https://doi.org/10.48550/arXiv.2005.05257) and [ECHR](https://doi.org/10.48550/arXiv.2005.05257) datasets
with annotations on the existence of legal ambiguity in the application of general statutes to specific fact patterns.
This dataset is created through a senior thesis project; please reference this work (link TBD) for more information.
## Dataset Contact
Christina Xiao (xiao.christina@gmail.com)
(citation TBD) |
Jiahuan/teach_object | ---
dataset_info:
features:
- name: doc_id
dtype: string
- name: start_time
dtype: string
- name: query
dtype: string
- name: action
dtype: string
- name: action_success
dtype: string
- name: object
dtype: string
splits:
- name: train
num_bytes: 4365626
num_examples: 32487
- name: validation
num_bytes: 551083
num_examples: 4139
- name: test
num_bytes: 1799637
num_examples: 13738
download_size: 1088565
dataset_size: 6716346
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
hatoum/mechanics.stackexchange | ---
license: cc-by-sa-4.0
---
|
sieecc/SOKI | ---
license: other
---
|
patrick65536/mandala_controlnet | ---
license: apache-2.0
dataset_info:
features:
- name: original_image
dtype: image
- name: condtioning_image
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 12212803.0
num_examples: 10
download_size: 0
dataset_size: 12212803.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
vidhikatkoria/DA_SGD_Music | ---
dataset_info:
features:
- name: domain
dtype: string
- name: context
dtype: string
- name: response
dtype: string
- name: act
dtype: int64
- name: speaker
dtype: int64
splits:
- name: train
num_bytes: 628520.2361015786
num_examples: 2913
- name: test
num_bytes: 149
num_examples: 1
download_size: 241847
dataset_size: 628669.2361015786
---
# Dataset Card for "DA_SGD_Music"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/squad_qa_baseline_v5_full_recite_full_passage | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 4369231
num_examples: 2385
- name: validation
num_bytes: 573308
num_examples: 300
download_size: 1012407
dataset_size: 4942539
---
# Dataset Card for "squad_qa_baseline_v5_full_recite_full_passage"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
andrewsiah/se_cooking_preference_sft | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 11095448
num_examples: 7262
download_size: 6879361
dataset_size: 11095448
---
# Dataset Card for "se_cooking_preference_sft"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
alexwan0/wikipedia-foods | ---
dataset_info:
features:
- name: pageid
dtype: int64
- name: title
dtype: string
- name: text
dtype: string
- name: summary
dtype: string
- name: images_all
sequence: string
- name: image
dtype: image
- name: label
dtype: int64
splits:
- name: train
num_bytes: 17943607118.0
num_examples: 59048
- name: validation
num_bytes: 3447356789.5
num_examples: 11684
download_size: 17024438254
dataset_size: 21390963907.5
---
# Dataset Card for "wikipedia-foods"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
qgyd2021/music_comment | ---
license: apache-2.0
language:
- zh
tags:
- music
size_categories:
- 100M<n<1B
---
## 49万港台内地歌曲信息
数据来源于 [QQMusicSpider](https://github.com/yangjianxin1/QQMusicSpider).
数据可用于:
* 根据歌手创作歌词.
* 根据歌名创作歌词.
* 根据歌名写评论.
|
KayoSilva88777/AlanJackson | ---
license: openrail
---
|
josedonoso/apples-dataset-v1 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 2704421.0
num_examples: 192
- name: test
num_bytes: 646648.0
num_examples: 48
download_size: 3236890
dataset_size: 3351069.0
---
# Dataset Card for "apples-dataset-v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pavlichenko/WizardLM_evol_instruct_70k_train_val_split | ---
task_categories:
- conversational
size_categories:
- 10K<n<100K
--- |
DanielPFlorian/Transformers-Github-Issues | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: labels
list:
- name: id
dtype: int64
- name: node_id
dtype: string
- name: url
dtype: string
- name: name
dtype: string
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: login
dtype: string
- name: id
dtype: float64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: assignees
list:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: comments
sequence: string
- name: created_at
dtype: int64
- name: updated_at
dtype: int64
- name: closed_at
dtype: int64
- name: author_association
dtype: string
- name: active_lock_reason
dtype: string
- name: body
dtype: string
- name: reactions
struct:
- name: url
dtype: string
- name: total_count
dtype: int64
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: laugh
dtype: int64
- name: hooray
dtype: int64
- name: confused
dtype: int64
- name: heart
dtype: int64
- name: rocket
dtype: int64
- name: eyes
dtype: int64
- name: timeline_url
dtype: string
- name: state_reason
dtype: string
- name: draft
dtype: bool
- name: pull_request
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: diff_url
dtype: string
- name: patch_url
dtype: string
- name: merged_at
dtype: int64
splits:
- name: train
num_bytes: 157397577
num_examples: 28908
download_size: 49674263
dataset_size: 157397577
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: unknown
task_categories:
- text-classification
- text-retrieval
language:
- en
tags:
- Github
- Transformers
- github-issues
- huggingface
pretty_name: Transformers-Github-Issues
size_categories:
- 10K<n<100K
--- |
autoevaluate/autoeval-staging-eval-project-083d71a4-50b6-4074-aa7d-a46eddb83f06-42 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- emotion
eval_info:
task: multi_class_classification
model: autoevaluate/multi-class-classification
metrics: ['matthews_correlation']
dataset_name: emotion
dataset_config: default
dataset_split: test
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: autoevaluate/multi-class-classification
* Dataset: emotion
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
juege/agemo | ---
license: openrail
---
|
xiwen426/Keysight_Dataset | ---
task_categories:
- text-generation
language:
- en
--- |
zxying/findsum-5k | ---
license: odc-by
---
|
open-llm-leaderboard/details_project-baize__baize-healthcare-lora-7B | ---
pretty_name: Evaluation run of project-baize/baize-healthcare-lora-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [project-baize/baize-healthcare-lora-7B](https://huggingface.co/project-baize/baize-healthcare-lora-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_project-baize__baize-healthcare-lora-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-17T12:07:23.383581](https://huggingface.co/datasets/open-llm-leaderboard/details_project-baize__baize-healthcare-lora-7B/blob/main/results_2023-10-17T12-07-23.383581.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001153523489932886,\n\
\ \"em_stderr\": 0.00034761798968570957,\n \"f1\": 0.05929215604026857,\n\
\ \"f1_stderr\": 0.0013287960656248844,\n \"acc\": 0.3862326042845355,\n\
\ \"acc_stderr\": 0.009073496352009793\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.001153523489932886,\n \"em_stderr\": 0.00034761798968570957,\n\
\ \"f1\": 0.05929215604026857,\n \"f1_stderr\": 0.0013287960656248844\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04397270659590599,\n \
\ \"acc_stderr\": 0.00564766644912646\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.728492501973165,\n \"acc_stderr\": 0.012499326254893126\n\
\ }\n}\n```"
repo_url: https://huggingface.co/project-baize/baize-healthcare-lora-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|arc:challenge|25_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_16T20_51_20.232990
path:
- '**/details_harness|drop|3_2023-10-16T20-51-20.232990.parquet'
- split: 2023_10_17T12_07_23.383581
path:
- '**/details_harness|drop|3_2023-10-17T12-07-23.383581.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-17T12-07-23.383581.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_16T20_51_20.232990
path:
- '**/details_harness|gsm8k|5_2023-10-16T20-51-20.232990.parquet'
- split: 2023_10_17T12_07_23.383581
path:
- '**/details_harness|gsm8k|5_2023-10-17T12-07-23.383581.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-17T12-07-23.383581.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hellaswag|10_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T17:11:44.232250.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_22T17_11_44.232250
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T17:11:44.232250.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T17:11:44.232250.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_16T20_51_20.232990
path:
- '**/details_harness|winogrande|5_2023-10-16T20-51-20.232990.parquet'
- split: 2023_10_17T12_07_23.383581
path:
- '**/details_harness|winogrande|5_2023-10-17T12-07-23.383581.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-17T12-07-23.383581.parquet'
- config_name: results
data_files:
- split: 2023_10_16T20_51_20.232990
path:
- results_2023-10-16T20-51-20.232990.parquet
- split: 2023_10_17T12_07_23.383581
path:
- results_2023-10-17T12-07-23.383581.parquet
- split: latest
path:
- results_2023-10-17T12-07-23.383581.parquet
---
# Dataset Card for Evaluation run of project-baize/baize-healthcare-lora-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/project-baize/baize-healthcare-lora-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [project-baize/baize-healthcare-lora-7B](https://huggingface.co/project-baize/baize-healthcare-lora-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_project-baize__baize-healthcare-lora-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-17T12:07:23.383581](https://huggingface.co/datasets/open-llm-leaderboard/details_project-baize__baize-healthcare-lora-7B/blob/main/results_2023-10-17T12-07-23.383581.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001153523489932886,
"em_stderr": 0.00034761798968570957,
"f1": 0.05929215604026857,
"f1_stderr": 0.0013287960656248844,
"acc": 0.3862326042845355,
"acc_stderr": 0.009073496352009793
},
"harness|drop|3": {
"em": 0.001153523489932886,
"em_stderr": 0.00034761798968570957,
"f1": 0.05929215604026857,
"f1_stderr": 0.0013287960656248844
},
"harness|gsm8k|5": {
"acc": 0.04397270659590599,
"acc_stderr": 0.00564766644912646
},
"harness|winogrande|5": {
"acc": 0.728492501973165,
"acc_stderr": 0.012499326254893126
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_h2oai__h2ogpt-research-oasst1-llama-65b | ---
pretty_name: Evaluation run of h2oai/h2ogpt-research-oasst1-llama-65b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [h2oai/h2ogpt-research-oasst1-llama-65b](https://huggingface.co/h2oai/h2ogpt-research-oasst1-llama-65b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_h2oai__h2ogpt-research-oasst1-llama-65b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-17T22:10:29.981773](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-research-oasst1-llama-65b/blob/main/results_2023-08-17T22%3A10%3A29.981773.json)\
\ (note that their might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6359037673839993,\n\
\ \"acc_stderr\": 0.0329346816196445,\n \"acc_norm\": 0.6396809356138717,\n\
\ \"acc_norm_stderr\": 0.03290965482744071,\n \"mc1\": 0.34394124847001223,\n\
\ \"mc1_stderr\": 0.01662908751427678,\n \"mc2\": 0.48845185520886875,\n\
\ \"mc2_stderr\": 0.014057830912491135\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6177474402730375,\n \"acc_stderr\": 0.014200454049979275,\n\
\ \"acc_norm\": 0.6476109215017065,\n \"acc_norm_stderr\": 0.01396014260059868\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6664011153156741,\n\
\ \"acc_stderr\": 0.004705347137699622,\n \"acc_norm\": 0.8593905596494722,\n\
\ \"acc_norm_stderr\": 0.0034690778470563765\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.042849586397534015,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.042849586397534015\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03523807393012047,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03523807393012047\n \
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6339622641509434,\n \"acc_stderr\": 0.029647813539365245,\n\
\ \"acc_norm\": 0.6339622641509434,\n \"acc_norm_stderr\": 0.029647813539365245\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\
\ \"acc_stderr\": 0.03800968060554858,\n \"acc_norm\": 0.7083333333333334,\n\
\ \"acc_norm_stderr\": 0.03800968060554858\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5549132947976878,\n\
\ \"acc_stderr\": 0.03789401760283648,\n \"acc_norm\": 0.5549132947976878,\n\
\ \"acc_norm_stderr\": 0.03789401760283648\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101737,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101737\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n\
\ \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.38596491228070173,\n\
\ \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155254,\n \"\
acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155254\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7483870967741936,\n\
\ \"acc_stderr\": 0.024685979286239963,\n \"acc_norm\": 0.7483870967741936,\n\
\ \"acc_norm_stderr\": 0.024685979286239963\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4039408866995074,\n \"acc_stderr\": 0.0345245390382204,\n\
\ \"acc_norm\": 0.4039408866995074,\n \"acc_norm_stderr\": 0.0345245390382204\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526066,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.047258156262526066\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721164,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721164\n \
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463355,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463355\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593542,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593542\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6256410256410256,\n \"acc_stderr\": 0.024537591572830513,\n\
\ \"acc_norm\": 0.6256410256410256,\n \"acc_norm_stderr\": 0.024537591572830513\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.030489911417673227,\n\
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.030489911417673227\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3973509933774834,\n \"acc_stderr\": 0.0399552400768168,\n \"acc_norm\"\
: 0.3973509933774834,\n \"acc_norm_stderr\": 0.0399552400768168\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8275229357798165,\n\
\ \"acc_stderr\": 0.016197807956848043,\n \"acc_norm\": 0.8275229357798165,\n\
\ \"acc_norm_stderr\": 0.016197807956848043\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.03372343271653062,\n\
\ \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.03372343271653062\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931055,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931055\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8396624472573839,\n \"acc_stderr\": 0.02388438092596567,\n \
\ \"acc_norm\": 0.8396624472573839,\n \"acc_norm_stderr\": 0.02388438092596567\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.030769352008229146,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.030769352008229146\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742179,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742179\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.013890862162876166,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.013890862162876166\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.02418242749657761,\n\
\ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.02418242749657761\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4770949720670391,\n\
\ \"acc_stderr\": 0.016704945740326188,\n \"acc_norm\": 0.4770949720670391,\n\
\ \"acc_norm_stderr\": 0.016704945740326188\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.026787453111906497,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.026787453111906497\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7459807073954984,\n\
\ \"acc_stderr\": 0.0247238615047717,\n \"acc_norm\": 0.7459807073954984,\n\
\ \"acc_norm_stderr\": 0.0247238615047717\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904212,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904212\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4830508474576271,\n\
\ \"acc_stderr\": 0.01276289688921086,\n \"acc_norm\": 0.4830508474576271,\n\
\ \"acc_norm_stderr\": 0.01276289688921086\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6066176470588235,\n \"acc_stderr\": 0.029674288281311155,\n\
\ \"acc_norm\": 0.6066176470588235,\n \"acc_norm_stderr\": 0.029674288281311155\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507205,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507205\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n\
\ \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n\
\ \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6653061224489796,\n \"acc_stderr\": 0.030209235226242304,\n\
\ \"acc_norm\": 0.6653061224489796,\n \"acc_norm_stderr\": 0.030209235226242304\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896308,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896308\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34394124847001223,\n\
\ \"mc1_stderr\": 0.01662908751427678,\n \"mc2\": 0.48845185520886875,\n\
\ \"mc2_stderr\": 0.014057830912491135\n }\n}\n```"
repo_url: https://huggingface.co/h2oai/h2ogpt-research-oasst1-llama-65b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|arc:challenge|25_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|arc:challenge|25_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hellaswag|10_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hellaswag|10_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T17:53:50.635044.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T22:10:29.981773.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T22:10:29.981773.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T17:53:50.635044.parquet'
- split: 2023_08_17T22_10_29.981773
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T22:10:29.981773.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T22:10:29.981773.parquet'
- config_name: results
data_files:
- split: 2023_08_17T17_53_50.635044
path:
- results_2023-08-17T17:53:50.635044.parquet
- split: 2023_08_17T22_10_29.981773
path:
- results_2023-08-17T22:10:29.981773.parquet
- split: latest
path:
- results_2023-08-17T22:10:29.981773.parquet
---
# Dataset Card for Evaluation run of h2oai/h2ogpt-research-oasst1-llama-65b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/h2oai/h2ogpt-research-oasst1-llama-65b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [h2oai/h2ogpt-research-oasst1-llama-65b](https://huggingface.co/h2oai/h2ogpt-research-oasst1-llama-65b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_h2oai__h2ogpt-research-oasst1-llama-65b",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-17T22:10:29.981773](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-research-oasst1-llama-65b/blob/main/results_2023-08-17T22%3A10%3A29.981773.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6359037673839993,
"acc_stderr": 0.0329346816196445,
"acc_norm": 0.6396809356138717,
"acc_norm_stderr": 0.03290965482744071,
"mc1": 0.34394124847001223,
"mc1_stderr": 0.01662908751427678,
"mc2": 0.48845185520886875,
"mc2_stderr": 0.014057830912491135
},
"harness|arc:challenge|25": {
"acc": 0.6177474402730375,
"acc_stderr": 0.014200454049979275,
"acc_norm": 0.6476109215017065,
"acc_norm_stderr": 0.01396014260059868
},
"harness|hellaswag|10": {
"acc": 0.6664011153156741,
"acc_stderr": 0.004705347137699622,
"acc_norm": 0.8593905596494722,
"acc_norm_stderr": 0.0034690778470563765
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.042849586397534015,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.042849586397534015
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.75,
"acc_stderr": 0.03523807393012047,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03523807393012047
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6339622641509434,
"acc_stderr": 0.029647813539365245,
"acc_norm": 0.6339622641509434,
"acc_norm_stderr": 0.029647813539365245
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.03800968060554858,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.03800968060554858
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5549132947976878,
"acc_stderr": 0.03789401760283648,
"acc_norm": 0.5549132947976878,
"acc_norm_stderr": 0.03789401760283648
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6,
"acc_stderr": 0.03202563076101737,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03202563076101737
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.04579639422070434,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.04579639422070434
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155254,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155254
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7483870967741936,
"acc_stderr": 0.024685979286239963,
"acc_norm": 0.7483870967741936,
"acc_norm_stderr": 0.024685979286239963
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4039408866995074,
"acc_stderr": 0.0345245390382204,
"acc_norm": 0.4039408866995074,
"acc_norm_stderr": 0.0345245390382204
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8,
"acc_stderr": 0.031234752377721164,
"acc_norm": 0.8,
"acc_norm_stderr": 0.031234752377721164
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463355,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463355
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593542,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593542
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6256410256410256,
"acc_stderr": 0.024537591572830513,
"acc_norm": 0.6256410256410256,
"acc_norm_stderr": 0.024537591572830513
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.030489911417673227,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.030489911417673227
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3973509933774834,
"acc_stderr": 0.0399552400768168,
"acc_norm": 0.3973509933774834,
"acc_norm_stderr": 0.0399552400768168
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.016197807956848043,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.016197807956848043
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.03372343271653062,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.03372343271653062
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931055,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931055
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8396624472573839,
"acc_stderr": 0.02388438092596567,
"acc_norm": 0.8396624472573839,
"acc_norm_stderr": 0.02388438092596567
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229146,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229146
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742179,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742179
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.013890862162876166,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.013890862162876166
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.02418242749657761,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.02418242749657761
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4770949720670391,
"acc_stderr": 0.016704945740326188,
"acc_norm": 0.4770949720670391,
"acc_norm_stderr": 0.016704945740326188
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.026787453111906497,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.026787453111906497
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7459807073954984,
"acc_stderr": 0.0247238615047717,
"acc_norm": 0.7459807073954984,
"acc_norm_stderr": 0.0247238615047717
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.02399350170904212,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.02399350170904212
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4830508474576271,
"acc_stderr": 0.01276289688921086,
"acc_norm": 0.4830508474576271,
"acc_norm_stderr": 0.01276289688921086
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6066176470588235,
"acc_stderr": 0.029674288281311155,
"acc_norm": 0.6066176470588235,
"acc_norm_stderr": 0.029674288281311155
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507205,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507205
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940589,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940589
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6653061224489796,
"acc_stderr": 0.030209235226242304,
"acc_norm": 0.6653061224489796,
"acc_norm_stderr": 0.030209235226242304
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896308,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896308
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34394124847001223,
"mc1_stderr": 0.01662908751427678,
"mc2": 0.48845185520886875,
"mc2_stderr": 0.014057830912491135
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
NbAiLab/NST | ---
license: apache-2.0
---
|
autoevaluate/autoeval-staging-eval-project-183be059-9075194 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- conll2003
eval_info:
task: entity_extraction
model: dslim/bert-base-NER
metrics: []
dataset_name: conll2003
dataset_config: conll2003
dataset_split: test
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: dslim/bert-base-NER
* Dataset: conll2003
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@aseifert](https://huggingface.co/aseifert) for evaluating this model. |
open-llm-leaderboard/details_JaeyeonKang__CCK_Gony_v3.3 | ---
pretty_name: Evaluation run of JaeyeonKang/CCK_Gony_v3.3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [JaeyeonKang/CCK_Gony_v3.3](https://huggingface.co/JaeyeonKang/CCK_Gony_v3.3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_JaeyeonKang__CCK_Gony_v3.3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-02T14:16:13.968146](https://huggingface.co/datasets/open-llm-leaderboard/details_JaeyeonKang__CCK_Gony_v3.3/blob/main/results_2024-02-02T14-16-13.968146.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7127245389126519,\n\
\ \"acc_stderr\": 0.030324827741987404,\n \"acc_norm\": 0.7169197048023546,\n\
\ \"acc_norm_stderr\": 0.030907560438510125,\n \"mc1\": 0.5250917992656059,\n\
\ \"mc1_stderr\": 0.017481446804104017,\n \"mc2\": 0.6741286843212236,\n\
\ \"mc2_stderr\": 0.015021442699186793\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.674061433447099,\n \"acc_stderr\": 0.013697432466693242,\n\
\ \"acc_norm\": 0.7039249146757679,\n \"acc_norm_stderr\": 0.013340916085246252\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6921927902808206,\n\
\ \"acc_stderr\": 0.004606429684604527,\n \"acc_norm\": 0.8788090021907986,\n\
\ \"acc_norm_stderr\": 0.0032568214188573178\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6814814814814815,\n\
\ \"acc_stderr\": 0.04024778401977108,\n \"acc_norm\": 0.6814814814814815,\n\
\ \"acc_norm_stderr\": 0.04024778401977108\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7960526315789473,\n \"acc_stderr\": 0.032790004063100495,\n\
\ \"acc_norm\": 0.7960526315789473,\n \"acc_norm_stderr\": 0.032790004063100495\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n\
\ \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7811320754716982,\n \"acc_stderr\": 0.0254478638251086,\n\
\ \"acc_norm\": 0.7811320754716982,\n \"acc_norm_stderr\": 0.0254478638251086\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7986111111111112,\n\
\ \"acc_stderr\": 0.033536474697138406,\n \"acc_norm\": 0.7986111111111112,\n\
\ \"acc_norm_stderr\": 0.033536474697138406\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7456647398843931,\n\
\ \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.7456647398843931,\n\
\ \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n\
\ \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6723404255319149,\n \"acc_stderr\": 0.03068302084323101,\n\
\ \"acc_norm\": 0.6723404255319149,\n \"acc_norm_stderr\": 0.03068302084323101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6228070175438597,\n\
\ \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.6228070175438597,\n\
\ \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6620689655172414,\n \"acc_stderr\": 0.039417076320648906,\n\
\ \"acc_norm\": 0.6620689655172414,\n \"acc_norm_stderr\": 0.039417076320648906\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4947089947089947,\n \"acc_stderr\": 0.02574986828855657,\n \"\
acc_norm\": 0.4947089947089947,\n \"acc_norm_stderr\": 0.02574986828855657\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5634920634920635,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.5634920634920635,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8419354838709677,\n \"acc_stderr\": 0.020752831511875278,\n \"\
acc_norm\": 0.8419354838709677,\n \"acc_norm_stderr\": 0.020752831511875278\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.6009852216748769,\n \"acc_stderr\": 0.034454876862647144,\n \"\
acc_norm\": 0.6009852216748769,\n \"acc_norm_stderr\": 0.034454876862647144\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"\
acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9481865284974094,\n \"acc_stderr\": 0.01599622932024412,\n\
\ \"acc_norm\": 0.9481865284974094,\n \"acc_norm_stderr\": 0.01599622932024412\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7025641025641025,\n \"acc_stderr\": 0.023177408131465942,\n\
\ \"acc_norm\": 0.7025641025641025,\n \"acc_norm_stderr\": 0.023177408131465942\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37777777777777777,\n \"acc_stderr\": 0.02956070739246572,\n \
\ \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.02956070739246572\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.026265024608275882,\n\
\ \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.026265024608275882\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.46357615894039733,\n \"acc_stderr\": 0.04071636065944215,\n \"\
acc_norm\": 0.46357615894039733,\n \"acc_norm_stderr\": 0.04071636065944215\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8899082568807339,\n \"acc_stderr\": 0.013419939018681203,\n \"\
acc_norm\": 0.8899082568807339,\n \"acc_norm_stderr\": 0.013419939018681203\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6111111111111112,\n \"acc_stderr\": 0.033247089118091176,\n \"\
acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.033247089118091176\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8529411764705882,\n \"acc_stderr\": 0.02485747808025045,\n \"\
acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.02485747808025045\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8523206751054853,\n \"acc_stderr\": 0.02309432958259569,\n \
\ \"acc_norm\": 0.8523206751054853,\n \"acc_norm_stderr\": 0.02309432958259569\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7713004484304933,\n\
\ \"acc_stderr\": 0.028188240046929203,\n \"acc_norm\": 0.7713004484304933,\n\
\ \"acc_norm_stderr\": 0.028188240046929203\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.034465133507525975,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.034465133507525975\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8677685950413223,\n \"acc_stderr\": 0.03092278832044579,\n \"\
acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.03092278832044579\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.036028141763926456,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.036028141763926456\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.803680981595092,\n \"acc_stderr\": 0.031207970394709218,\n\
\ \"acc_norm\": 0.803680981595092,\n \"acc_norm_stderr\": 0.031207970394709218\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5892857142857143,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.5892857142857143,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8640776699029126,\n \"acc_stderr\": 0.0339329572976101,\n\
\ \"acc_norm\": 0.8640776699029126,\n \"acc_norm_stderr\": 0.0339329572976101\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9188034188034188,\n\
\ \"acc_stderr\": 0.01789378490401853,\n \"acc_norm\": 0.9188034188034188,\n\
\ \"acc_norm_stderr\": 0.01789378490401853\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909282\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8786717752234994,\n\
\ \"acc_stderr\": 0.011675913883906723,\n \"acc_norm\": 0.8786717752234994,\n\
\ \"acc_norm_stderr\": 0.011675913883906723\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8005780346820809,\n \"acc_stderr\": 0.021511900654252552,\n\
\ \"acc_norm\": 0.8005780346820809,\n \"acc_norm_stderr\": 0.021511900654252552\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4547486033519553,\n\
\ \"acc_stderr\": 0.016653875777524006,\n \"acc_norm\": 0.4547486033519553,\n\
\ \"acc_norm_stderr\": 0.016653875777524006\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8071895424836601,\n \"acc_stderr\": 0.0225893188881767,\n\
\ \"acc_norm\": 0.8071895424836601,\n \"acc_norm_stderr\": 0.0225893188881767\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8038585209003215,\n\
\ \"acc_stderr\": 0.02255244778047802,\n \"acc_norm\": 0.8038585209003215,\n\
\ \"acc_norm_stderr\": 0.02255244778047802\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8271604938271605,\n \"acc_stderr\": 0.021038517770157365,\n\
\ \"acc_norm\": 0.8271604938271605,\n \"acc_norm_stderr\": 0.021038517770157365\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5567375886524822,\n \"acc_stderr\": 0.029634838473766006,\n \
\ \"acc_norm\": 0.5567375886524822,\n \"acc_norm_stderr\": 0.029634838473766006\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5430247718383312,\n\
\ \"acc_stderr\": 0.012722869501611419,\n \"acc_norm\": 0.5430247718383312,\n\
\ \"acc_norm_stderr\": 0.012722869501611419\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8014705882352942,\n \"acc_stderr\": 0.024231013370541087,\n\
\ \"acc_norm\": 0.8014705882352942,\n \"acc_norm_stderr\": 0.024231013370541087\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7761437908496732,\n \"acc_stderr\": 0.016863008585416613,\n \
\ \"acc_norm\": 0.7761437908496732,\n \"acc_norm_stderr\": 0.016863008585416613\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8040816326530612,\n \"acc_stderr\": 0.025409301953225678,\n\
\ \"acc_norm\": 0.8040816326530612,\n \"acc_norm_stderr\": 0.025409301953225678\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n\
\ \"acc_stderr\": 0.023729830881018526,\n \"acc_norm\": 0.8706467661691543,\n\
\ \"acc_norm_stderr\": 0.023729830881018526\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.032659863237109066,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.032659863237109066\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.024648068961366145,\n\
\ \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.024648068961366145\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5250917992656059,\n\
\ \"mc1_stderr\": 0.017481446804104017,\n \"mc2\": 0.6741286843212236,\n\
\ \"mc2_stderr\": 0.015021442699186793\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8121546961325967,\n \"acc_stderr\": 0.010977481103435091\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5822592873388931,\n \
\ \"acc_stderr\": 0.013584820638504823\n }\n}\n```"
repo_url: https://huggingface.co/JaeyeonKang/CCK_Gony_v3.3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|arc:challenge|25_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|gsm8k|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hellaswag|10_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T14-16-13.968146.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T14-16-13.968146.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- '**/details_harness|winogrande|5_2024-02-02T14-16-13.968146.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T14-16-13.968146.parquet'
- config_name: results
data_files:
- split: 2024_02_02T14_16_13.968146
path:
- results_2024-02-02T14-16-13.968146.parquet
- split: latest
path:
- results_2024-02-02T14-16-13.968146.parquet
---
# Dataset Card for Evaluation run of JaeyeonKang/CCK_Gony_v3.3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [JaeyeonKang/CCK_Gony_v3.3](https://huggingface.co/JaeyeonKang/CCK_Gony_v3.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_JaeyeonKang__CCK_Gony_v3.3",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-02T14:16:13.968146](https://huggingface.co/datasets/open-llm-leaderboard/details_JaeyeonKang__CCK_Gony_v3.3/blob/main/results_2024-02-02T14-16-13.968146.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7127245389126519,
"acc_stderr": 0.030324827741987404,
"acc_norm": 0.7169197048023546,
"acc_norm_stderr": 0.030907560438510125,
"mc1": 0.5250917992656059,
"mc1_stderr": 0.017481446804104017,
"mc2": 0.6741286843212236,
"mc2_stderr": 0.015021442699186793
},
"harness|arc:challenge|25": {
"acc": 0.674061433447099,
"acc_stderr": 0.013697432466693242,
"acc_norm": 0.7039249146757679,
"acc_norm_stderr": 0.013340916085246252
},
"harness|hellaswag|10": {
"acc": 0.6921927902808206,
"acc_stderr": 0.004606429684604527,
"acc_norm": 0.8788090021907986,
"acc_norm_stderr": 0.0032568214188573178
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6814814814814815,
"acc_stderr": 0.04024778401977108,
"acc_norm": 0.6814814814814815,
"acc_norm_stderr": 0.04024778401977108
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7960526315789473,
"acc_stderr": 0.032790004063100495,
"acc_norm": 0.7960526315789473,
"acc_norm_stderr": 0.032790004063100495
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7811320754716982,
"acc_stderr": 0.0254478638251086,
"acc_norm": 0.7811320754716982,
"acc_norm_stderr": 0.0254478638251086
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7986111111111112,
"acc_stderr": 0.033536474697138406,
"acc_norm": 0.7986111111111112,
"acc_norm_stderr": 0.033536474697138406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6723404255319149,
"acc_stderr": 0.03068302084323101,
"acc_norm": 0.6723404255319149,
"acc_norm_stderr": 0.03068302084323101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6228070175438597,
"acc_stderr": 0.04559522141958216,
"acc_norm": 0.6228070175438597,
"acc_norm_stderr": 0.04559522141958216
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6620689655172414,
"acc_stderr": 0.039417076320648906,
"acc_norm": 0.6620689655172414,
"acc_norm_stderr": 0.039417076320648906
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4947089947089947,
"acc_stderr": 0.02574986828855657,
"acc_norm": 0.4947089947089947,
"acc_norm_stderr": 0.02574986828855657
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5634920634920635,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.5634920634920635,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8419354838709677,
"acc_stderr": 0.020752831511875278,
"acc_norm": 0.8419354838709677,
"acc_norm_stderr": 0.020752831511875278
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6009852216748769,
"acc_stderr": 0.034454876862647144,
"acc_norm": 0.6009852216748769,
"acc_norm_stderr": 0.034454876862647144
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9481865284974094,
"acc_stderr": 0.01599622932024412,
"acc_norm": 0.9481865284974094,
"acc_norm_stderr": 0.01599622932024412
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7025641025641025,
"acc_stderr": 0.023177408131465942,
"acc_norm": 0.7025641025641025,
"acc_norm_stderr": 0.023177408131465942
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.02956070739246572,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.02956070739246572
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.026265024608275882,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.026265024608275882
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.46357615894039733,
"acc_stderr": 0.04071636065944215,
"acc_norm": 0.46357615894039733,
"acc_norm_stderr": 0.04071636065944215
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8899082568807339,
"acc_stderr": 0.013419939018681203,
"acc_norm": 0.8899082568807339,
"acc_norm_stderr": 0.013419939018681203
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.033247089118091176,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.033247089118091176
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.02485747808025045,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.02485747808025045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8523206751054853,
"acc_stderr": 0.02309432958259569,
"acc_norm": 0.8523206751054853,
"acc_norm_stderr": 0.02309432958259569
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7713004484304933,
"acc_stderr": 0.028188240046929203,
"acc_norm": 0.7713004484304933,
"acc_norm_stderr": 0.028188240046929203
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.034465133507525975,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.034465133507525975
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8677685950413223,
"acc_stderr": 0.03092278832044579,
"acc_norm": 0.8677685950413223,
"acc_norm_stderr": 0.03092278832044579
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.036028141763926456,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.036028141763926456
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.803680981595092,
"acc_stderr": 0.031207970394709218,
"acc_norm": 0.803680981595092,
"acc_norm_stderr": 0.031207970394709218
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5892857142857143,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.5892857142857143,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.8640776699029126,
"acc_stderr": 0.0339329572976101,
"acc_norm": 0.8640776699029126,
"acc_norm_stderr": 0.0339329572976101
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9188034188034188,
"acc_stderr": 0.01789378490401853,
"acc_norm": 0.9188034188034188,
"acc_norm_stderr": 0.01789378490401853
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8786717752234994,
"acc_stderr": 0.011675913883906723,
"acc_norm": 0.8786717752234994,
"acc_norm_stderr": 0.011675913883906723
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8005780346820809,
"acc_stderr": 0.021511900654252552,
"acc_norm": 0.8005780346820809,
"acc_norm_stderr": 0.021511900654252552
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4547486033519553,
"acc_stderr": 0.016653875777524006,
"acc_norm": 0.4547486033519553,
"acc_norm_stderr": 0.016653875777524006
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8071895424836601,
"acc_stderr": 0.0225893188881767,
"acc_norm": 0.8071895424836601,
"acc_norm_stderr": 0.0225893188881767
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8038585209003215,
"acc_stderr": 0.02255244778047802,
"acc_norm": 0.8038585209003215,
"acc_norm_stderr": 0.02255244778047802
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8271604938271605,
"acc_stderr": 0.021038517770157365,
"acc_norm": 0.8271604938271605,
"acc_norm_stderr": 0.021038517770157365
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5567375886524822,
"acc_stderr": 0.029634838473766006,
"acc_norm": 0.5567375886524822,
"acc_norm_stderr": 0.029634838473766006
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5430247718383312,
"acc_stderr": 0.012722869501611419,
"acc_norm": 0.5430247718383312,
"acc_norm_stderr": 0.012722869501611419
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8014705882352942,
"acc_stderr": 0.024231013370541087,
"acc_norm": 0.8014705882352942,
"acc_norm_stderr": 0.024231013370541087
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7761437908496732,
"acc_stderr": 0.016863008585416613,
"acc_norm": 0.7761437908496732,
"acc_norm_stderr": 0.016863008585416613
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8040816326530612,
"acc_stderr": 0.025409301953225678,
"acc_norm": 0.8040816326530612,
"acc_norm_stderr": 0.025409301953225678
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8706467661691543,
"acc_stderr": 0.023729830881018526,
"acc_norm": 0.8706467661691543,
"acc_norm_stderr": 0.023729830881018526
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.032659863237109066,
"acc_norm": 0.88,
"acc_norm_stderr": 0.032659863237109066
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.024648068961366145,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.024648068961366145
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5250917992656059,
"mc1_stderr": 0.017481446804104017,
"mc2": 0.6741286843212236,
"mc2_stderr": 0.015021442699186793
},
"harness|winogrande|5": {
"acc": 0.8121546961325967,
"acc_stderr": 0.010977481103435091
},
"harness|gsm8k|5": {
"acc": 0.5822592873388931,
"acc_stderr": 0.013584820638504823
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
neph1/stable-diffusion-prompt-pairs | ---
license: apache-2.0
---
Work in progress. A dataset for creating image generation tags from natural language descriptions.
Uses https://huggingface.co/Gustavosta/MagicPrompt-Stable-Diffusion for tags. Descriptions generated by chronos-hermes-13b-v2.
Please note that the dataset is generated in two batches, with different system prompts. The first is ~2000 rows. The second ~1000 rows. |
mideind/icelandic-english-translation | ---
license: cc-by-4.0
---
|
Chaymaa/grdf-v0 | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 799001.8
num_examples: 22
- name: test
num_bytes: 79971.8
num_examples: 2
- name: valid
num_bytes: 35420.4
num_examples: 1
download_size: 917831
dataset_size: 914394.0000000001
---
# Dataset Card for "grdf-v0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mudassar93/data_piano | ---
dataset_info:
features:
- name: response
dtype: string
- name: instruction
dtype: string
- name: chat
dtype: string
splits:
- name: train
num_bytes: 1080219
num_examples: 1823
download_size: 238567
dataset_size: 1080219
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/xayah_leagueoflegends | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of xayah (League of Legends)
This is the dataset of xayah (League of Legends), containing 44 images and their tags.
The core tags of this character are `long_hair, animal_ears, red_hair, facial_mark, yellow_eyes, breasts, bangs, hair_over_one_eye`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 44 | 71.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/xayah_leagueoflegends/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 44 | 36.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/xayah_leagueoflegends/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 106 | 77.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/xayah_leagueoflegends/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 44 | 61.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/xayah_leagueoflegends/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 106 | 113.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/xayah_leagueoflegends/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/xayah_leagueoflegends',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 24 |  |  |  |  |  | 1girl, solo, looking_at_viewer, hood_up, feathers, ears_through_headwear, simple_background, smile |
| 1 | 7 |  |  |  |  |  | 1girl, looking_at_viewer, solo, nipples, nude, pussy, navel, uncensored, large_breasts, thighhighs, medium_breasts, on_back, pink_hair, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | hood_up | feathers | ears_through_headwear | simple_background | smile | nipples | nude | pussy | navel | uncensored | large_breasts | thighhighs | medium_breasts | on_back | pink_hair |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:----------|:-----------|:------------------------|:--------------------|:--------|:----------|:-------|:--------|:--------|:-------------|:----------------|:-------------|:-----------------|:----------|:------------|
| 0 | 24 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | | | | | X | X | X | X | X | X | X | X | X | X | X |
|
joshuapsa/gpt-generated-news-paragraphs-v1.1 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
dataset_info:
features:
- name: class_name
dtype: string
- name: text
dtype: string
- name: aviation
dtype:
class_label:
names:
'0': '0'
'1': '1'
- name: cybersecurity
dtype:
class_label:
names:
'0': '0'
'1': '1'
- name: domestic_unrest_violence
dtype:
class_label:
names:
'0': '0'
'1': '1'
- name: extreme_weather
dtype:
class_label:
names:
'0': '0'
'1': '1'
- name: forced_labor
dtype:
class_label:
names:
'0': '0'
'1': '1'
- name: general_biz_trend
dtype:
class_label:
names:
'0': '0'
'1': '1'
- name: individual_accidents_tragedies
dtype:
class_label:
names:
'0': '0'
'1': '1'
- name: later_report
dtype:
class_label:
names:
'0': '0'
'1': '1'
- name: lawsuit_legal_insurance
dtype:
class_label:
names:
'0': '0'
'1': '1'
- name: leisure_other_news
dtype:
class_label:
names:
'0': '0'
'1': '1'
- name: maritime
dtype:
class_label:
names:
'0': '0'
'1': '1'
- name: pandemics_large_scale_diseases
dtype:
class_label:
names:
'0': '0'
'1': '1'
- name: railway
dtype:
class_label:
names:
'0': '0'
'1': '1'
- name: strike
dtype:
class_label:
names:
'0': '0'
'1': '1'
- name: trade_war_embargos_bans
dtype:
class_label:
names:
'0': '0'
'1': '1'
- name: transportation_trends_projects
dtype:
class_label:
names:
'0': '0'
'1': '1'
- name: war_conflict
dtype:
class_label:
names:
'0': '0'
'1': '1'
- name: warehouse_fire
dtype:
class_label:
names:
'0': '0'
'1': '1'
- name: class_index
dtype:
class_label:
names:
'0': '0'
'1': '1'
- name: label
sequence: int64
splits:
- name: train
num_bytes: 419816
num_examples: 720
- name: valid
num_bytes: 52468
num_examples: 90
- name: test
num_bytes: 52223
num_examples: 90
download_size: 179362
dataset_size: 524507
---
# Dataset Card for "gpt-generated-news-paragraphs-v1.1"
- This dataset was created solely for the purpose of code testing.
- This dataset was generated from prompting chatGPT to create sample pieces of news setences according to a topic.
- Sample prompt: "generate 50 paragraphs on the topic of "very recent breaking news on wars and conflicts events" with some sample location names. One example: "a missile struck near a residential building in Kiev last night. Russia denied Ukraine's accusations of attacking non-military targets""
- The output paragraphs were then used to construct huggingface dataset.
- Changes from v1.0: added column `class_name` for ease of use in downstream tasks
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lawful-good-project/ipc_decisions_4k_selected | ---
license: gpl-3.0
task_categories:
- text-generation
language:
- ru
tags:
- legal
size_categories:
- 1K<n<10K
---
Датасет судебных решений суда по интеллектуальным правам РФ с синтаксисом для дообучения с инструкциями. |
Nerfgun3/stripe_style | ---
language:
- en
license: creativeml-openrail-m
thumbnail: "https://huggingface.co/datasets/Nerfgun3/stripe_style/resolve/main/stripe_style_showcase.jpg"
tags:
- stable-diffusion
- text-to-image
- image-to-image
inference: false
---
# Stripe Style Embedding / Textual Inversion
<img alt="Showcase" src="https://huggingface.co/datasets/Nerfgun3/stripe_style/resolve/main/stripe_style_showcase.jpg"/>
## Usage
To use this embedding you have to download the file aswell as drop it into the "\stable-diffusion-webui\embeddings" folder
To use it in a prompt: ```"drawn by stripe_style"```
Personally, I would recommend to use my embeddings with a strength of 0.8, like ```"drawn by (stripe_style:0.8)"```
I trained the embedding two epochs until 5000 steps.
I hope you enjoy the embedding. If you have any questions, you can ask me anything via Discord: "Nerfgun3#7508"
## License
This embedding is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage.
The CreativeML OpenRAIL License specifies:
1. You can't use the embedding to deliberately produce nor share illegal or harmful outputs or content
2. The authors claims no rights on the outputs you generate, you are free to use them and are accountable for their use which must not go against the provisions set in the license
3. You may re-distribute the weights and use the embedding commercially and/or as a service. If you do, please be aware you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M to all your users (please read the license entirely and carefully)
[Please read the full license here](https://huggingface.co/spaces/CompVis/stable-diffusion-license) |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-88000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 666898
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jxu124/refcocoplus | ---
dataset_info:
features:
- name: sent_ids
sequence: int64
- name: file_name
dtype: string
- name: ann_id
dtype: int64
- name: ref_id
dtype: int64
- name: image_id
dtype: int64
- name: split
dtype: string
- name: sentences
list:
- name: raw
dtype: string
- name: sent
dtype: string
- name: sent_id
dtype: int64
- name: tokens
sequence: string
- name: category_id
dtype: int64
- name: raw_anns
dtype: string
- name: raw_image_info
dtype: string
- name: raw_sentences
dtype: string
- name: image_path
dtype: string
- name: bbox
sequence: float64
- name: captions
sequence: string
- name: global_image_id
dtype: string
- name: anns_id
dtype: string
splits:
- name: train
num_bytes: 81937869
num_examples: 42278
- name: testB
num_bytes: 3273927
num_examples: 1798
- name: test
num_bytes: 3969265
num_examples: 1975
- name: validation
num_bytes: 7399541
num_examples: 3805
download_size: 39772801
dataset_size: 96580602
---
# Dataset Card for "refcocoplus"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-lener_br-lener_br-c4cf3f-1771961516 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- lener_br
eval_info:
task: entity_extraction
model: Luciano/xlm-roberta-large-finetuned-lener-br
metrics: []
dataset_name: lener_br
dataset_config: lener_br
dataset_split: test
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: Luciano/xlm-roberta-large-finetuned-lener-br
* Dataset: lener_br
* Config: lener_br
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Luciano](https://huggingface.co/Luciano) for evaluating this model. |
maverickrzw/MUSE | ---
license: apache-2.0
---
|
anonymous-ai4science/ProFunc-9K | ---
license: cc-by-nc-sa-4.0
---
|
austindavis/chess_world_lichess_elite | ---
dataset_info:
features:
- name: Event
dtype: string
- name: Site
dtype: string
- name: Date
dtype: string
- name: Round
dtype: string
- name: White
dtype: string
- name: Black
dtype: string
- name: Result
dtype: string
- name: ECO
dtype: string
- name: WhiteElo
dtype: int64
- name: BlackElo
dtype: int64
- name: PlyCount
dtype: int64
- name: EventDate
dtype: string
- name: EventType
dtype: string
- name: transcript
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 157006085
num_examples: 234048
download_size: 78928248
dataset_size: 157006085
---
# Dataset Card for "chess_world_lichess_elite"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dhuynh95/Magicoder-Evol-Instruct-10000-CodeLlama-70b-tokenized-0.5-v2 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 22972759
num_examples: 10000
download_size: 11249692
dataset_size: 22972759
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
wannaphong/iapp_wiki_qa_squad_oa | ---
license: mit
language:
- th
tags:
- Open Assistant
---
This dataset is fork from [https://huggingface.co/datasets/iapp_wiki_qa_squad](https://huggingface.co/datasets/iapp_wiki_qa_squad) that made for Open Assistant.
Pull request: [Add iapp_wiki_qa_squad to datasets #1903 ](https://github.com/LAION-AI/Open-Assistant/pull/1903) |
DeepFoldProtein/2022-12-17-pdb-intersect-pisces_pc30_r2.5_processed_1024_ankh_test | ---
dataset_info:
features:
- name: pdb_id
dtype: string
- name: chain_code
dtype: string
- name: seq
dtype: string
- name: sst8
dtype: string
- name: sst3
dtype: string
- name: len_x
dtype: int64
- name: has_nonstd_aa
dtype: bool
- name: len_y
dtype: int64
- name: method
dtype: string
- name: resol
dtype: float64
- name: rfac
dtype: float64
- name: freerfac
dtype: float64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: special_tokens_mask
sequence: int8
- name: label
sequence: int64
splits:
- name: train
num_bytes: 829010
num_examples: 100
download_size: 66672
dataset_size: 829010
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
stoddur/med_chat_balanced | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 193000000.0
num_examples: 125000
download_size: 4450368
dataset_size: 193000000.0
---
# Dataset Card for "med_chat_balanced"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jlbaker361/art-faces-hard-prompt | ---
dataset_info:
features:
- name: label
dtype: string
- name: optimal_prompt
dtype: string
- name: splash
dtype: image
- name: tile
dtype: image
- name: subject
dtype: string
splits:
- name: train
num_bytes: 45792564.0
num_examples: 80
download_size: 45794708
dataset_size: 45792564.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Lakera/gandalf_ignore_instructions | ---
language:
- en
license: mit
size_categories:
- 1K<n<10K
dataset_info:
features:
- name: text
dtype: string
- name: similarity
dtype: float64
splits:
- name: train
num_bytes: 66400
num_examples: 777
- name: validation
num_bytes: 9633
num_examples: 111
- name: test
num_bytes: 9747
num_examples: 112
download_size: 51515
dataset_size: 85780
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
tags:
- prompt injection
---
# gandalf_ignore_instructions

This is a dataset of prompt injections from [Gandalf](https://gandalf.lakera.ai/) by [Lakera](https://www.lakera.ai/).
Note that we might update the dataset occasionally by cleaning the data or adding more samples.
## How the data was obtained
There are millions of prompts and many of them are not actual prompt injections (people ask Gandalf all kinds of things).
We used the following process to obtain relevant data:
- Start with all prompts submitted to Gandalf in July 2023.
- Use [OpenAI text embeddings](https://platform.openai.com/docs/guides/embeddings/what-are-embeddings) (`text-embedding-ada-002`) to compute the similarity between the prompts and the phrase "Ignore all previous instructions".
- Filter to those prompts that have similarity of at least 0.825.
- Use [Lakera Guard](https://platform.lakera.ai/docs)'s personally identifiable information detector to filter out prompts that potentially contain PII data.
- Remove near-duplicates from the data (prompts that differ only by a few letters) using an approximate algorithm. This helps reduce leakage between the data splits.
- Sample 1000 prompts.
- Split the data into train-val-test with an 80/10/10 ratio. Each sample is assigned independently so the size of the train split is not _exactly_ 80% and so on.
Note that there is a small amount of noise in the data since an automatic method was used to obtain it: a few of the samples might not be real prompt injections.
## Citation
If you use this dataset in your research, please cite it as
```
@InProceedings{gandalf_ignore_instructions,
title = {gandalf_ignore_instructions},
author={Lakera AI (https://www.lakera.ai)},
year={2023}
}
```
## Licensing Information
gandalf_ignore_instructions is distributed under the [MIT License](https://opensource.org/license/mit/). |
CyberHarem/houston_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of houston/ヒューストン/休斯敦 (Azur Lane)
This is the dataset of houston/ヒューストン/休斯敦 (Azur Lane), containing 16 images and their tags.
The core tags of this character are `green_eyes, pink_hair, long_hair, two_side_up, breasts, ahoge, bangs, small_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 16 | 15.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/houston_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 16 | 10.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/houston_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 39 | 22.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/houston_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 16 | 14.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/houston_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 39 | 28.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/houston_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/houston_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 16 |  |  |  |  |  | 1girl, looking_at_viewer, blush, bare_shoulders, navel, smile, solo, star_(symbol), open_mouth, collarbone, shorts, black_choker, midriff, criss-cross_halter, red_gloves, simple_background, stomach, thighhighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | blush | bare_shoulders | navel | smile | solo | star_(symbol) | open_mouth | collarbone | shorts | black_choker | midriff | criss-cross_halter | red_gloves | simple_background | stomach | thighhighs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------|:-----------------|:--------|:--------|:-------|:----------------|:-------------|:-------------|:---------|:---------------|:----------|:---------------------|:-------------|:--------------------|:----------|:-------------|
| 0 | 16 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
princeton-nlp/TutorEval | ---
dataset_info:
features:
- name: chapter
dtype: string
- name: question
dtype: string
- name: key_points
dtype: string
- name: closed_book
dtype: bool
- name: answer_in_chapter
dtype: bool
- name: misleading_question
dtype: bool
- name: difficulty
dtype: string
- name: domain
dtype: string
- name: path_to_chapter
dtype: string
splits:
- name: train
num_bytes: 10429630
num_examples: 834
download_size: 1337601
dataset_size: 10429630
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Dampish/birdie | ---
license: cc-by-nc-4.0
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 3788383238
num_examples: 299800
download_size: 1204729544
dataset_size: 3788383238
---
|
smaciu/bee-wings-large | ---
task_categories:
- feature-extraction
pretty_name: Collection of wing images for conservation of honey bees (Apis mellifera) biodiversity in Europe
size_categories:
- 10K<n<100K
---
Collection of wing images for conservation of honey bees (Apis mellifera) biodiversity in Europe
https://zenodo.org/record/7244070
|
Xieyiyiyi/cceee | ---
license: bsl-1.0
---
|
liuyanchen1015/VALUE_wnli_drop_aux | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 370
num_examples: 3
- name: test
num_bytes: 2929
num_examples: 12
- name: train
num_bytes: 10083
num_examples: 66
download_size: 12687
dataset_size: 13382
---
# Dataset Card for "VALUE_wnli_drop_aux"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
devmehta787/wav2vec2-xlsr-hindi | ---
license: afl-3.0
---
|
mHossain/buet_new_para_detection_data_v1 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 7544025.9
num_examples: 36000
- name: test
num_bytes: 838225.1
num_examples: 4000
download_size: 3651001
dataset_size: 8382251.0
---
# Dataset Card for "buet_new_para_detection_data_v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/poor4kids_2_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 2371
num_examples: 10
download_size: 3087
dataset_size: 2371
---
# Dataset Card for "poor4kids_2_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-squad_v2-94d8b010-11595543 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- squad_v2
eval_info:
task: extractive_question_answering
model: autoevaluate/distilbert-base-cased-distilled-squad
metrics: []
dataset_name: squad_v2
dataset_config: squad_v2
dataset_split: validation
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: autoevaluate/distilbert-base-cased-distilled-squad
* Dataset: squad_v2
* Config: squad_v2
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
CVasNLPExperiments/docvqa_valid_google_flan_t5_xxl_mode_OCR_VQA_Q_rices_ns_10 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0
num_bytes: 1094
num_examples: 10
download_size: 3585
dataset_size: 1094
configs:
- config_name: default
data_files:
- split: fewshot_0
path: data/fewshot_0-*
---
|
Asimok/KGLQA-AblationStudy | ---
license: apache-2.0
---
|
SyedAunZaidi/cv-corpus-13.0-ur | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: client_id
dtype: string
- name: path
dtype: string
- name: sentence
dtype: string
- name: up_votes
dtype: int64
- name: down_votes
dtype: int64
- name: age
dtype: string
- name: gender
dtype: string
- name: accents
dtype: string
- name: variant
dtype: float64
- name: locale
dtype: string
- name: segment
dtype: float64
- name: config
dtype: string
splits:
- name: train
num_bytes: 108669095.966
num_examples: 4129
- name: test
num_bytes: 80563643.32
num_examples: 3265
- name: validation
num_bytes: 80563643.32
num_examples: 3265
download_size: 270064851
dataset_size: 269796382.606
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
jgwill/gia-young-picasso-v02b-201210-864 | ---
license: artistic-2.0
---
|
EgilKarlsen/CSIC_GPTNEO_Baseline | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: '768'
dtype: float32
- name: '769'
dtype: float32
- name: '770'
dtype: float32
- name: '771'
dtype: float32
- name: '772'
dtype: float32
- name: '773'
dtype: float32
- name: '774'
dtype: float32
- name: '775'
dtype: float32
- name: '776'
dtype: float32
- name: '777'
dtype: float32
- name: '778'
dtype: float32
- name: '779'
dtype: float32
- name: '780'
dtype: float32
- name: '781'
dtype: float32
- name: '782'
dtype: float32
- name: '783'
dtype: float32
- name: '784'
dtype: float32
- name: '785'
dtype: float32
- name: '786'
dtype: float32
- name: '787'
dtype: float32
- name: '788'
dtype: float32
- name: '789'
dtype: float32
- name: '790'
dtype: float32
- name: '791'
dtype: float32
- name: '792'
dtype: float32
- name: '793'
dtype: float32
- name: '794'
dtype: float32
- name: '795'
dtype: float32
- name: '796'
dtype: float32
- name: '797'
dtype: float32
- name: '798'
dtype: float32
- name: '799'
dtype: float32
- name: '800'
dtype: float32
- name: '801'
dtype: float32
- name: '802'
dtype: float32
- name: '803'
dtype: float32
- name: '804'
dtype: float32
- name: '805'
dtype: float32
- name: '806'
dtype: float32
- name: '807'
dtype: float32
- name: '808'
dtype: float32
- name: '809'
dtype: float32
- name: '810'
dtype: float32
- name: '811'
dtype: float32
- name: '812'
dtype: float32
- name: '813'
dtype: float32
- name: '814'
dtype: float32
- name: '815'
dtype: float32
- name: '816'
dtype: float32
- name: '817'
dtype: float32
- name: '818'
dtype: float32
- name: '819'
dtype: float32
- name: '820'
dtype: float32
- name: '821'
dtype: float32
- name: '822'
dtype: float32
- name: '823'
dtype: float32
- name: '824'
dtype: float32
- name: '825'
dtype: float32
- name: '826'
dtype: float32
- name: '827'
dtype: float32
- name: '828'
dtype: float32
- name: '829'
dtype: float32
- name: '830'
dtype: float32
- name: '831'
dtype: float32
- name: '832'
dtype: float32
- name: '833'
dtype: float32
- name: '834'
dtype: float32
- name: '835'
dtype: float32
- name: '836'
dtype: float32
- name: '837'
dtype: float32
- name: '838'
dtype: float32
- name: '839'
dtype: float32
- name: '840'
dtype: float32
- name: '841'
dtype: float32
- name: '842'
dtype: float32
- name: '843'
dtype: float32
- name: '844'
dtype: float32
- name: '845'
dtype: float32
- name: '846'
dtype: float32
- name: '847'
dtype: float32
- name: '848'
dtype: float32
- name: '849'
dtype: float32
- name: '850'
dtype: float32
- name: '851'
dtype: float32
- name: '852'
dtype: float32
- name: '853'
dtype: float32
- name: '854'
dtype: float32
- name: '855'
dtype: float32
- name: '856'
dtype: float32
- name: '857'
dtype: float32
- name: '858'
dtype: float32
- name: '859'
dtype: float32
- name: '860'
dtype: float32
- name: '861'
dtype: float32
- name: '862'
dtype: float32
- name: '863'
dtype: float32
- name: '864'
dtype: float32
- name: '865'
dtype: float32
- name: '866'
dtype: float32
- name: '867'
dtype: float32
- name: '868'
dtype: float32
- name: '869'
dtype: float32
- name: '870'
dtype: float32
- name: '871'
dtype: float32
- name: '872'
dtype: float32
- name: '873'
dtype: float32
- name: '874'
dtype: float32
- name: '875'
dtype: float32
- name: '876'
dtype: float32
- name: '877'
dtype: float32
- name: '878'
dtype: float32
- name: '879'
dtype: float32
- name: '880'
dtype: float32
- name: '881'
dtype: float32
- name: '882'
dtype: float32
- name: '883'
dtype: float32
- name: '884'
dtype: float32
- name: '885'
dtype: float32
- name: '886'
dtype: float32
- name: '887'
dtype: float32
- name: '888'
dtype: float32
- name: '889'
dtype: float32
- name: '890'
dtype: float32
- name: '891'
dtype: float32
- name: '892'
dtype: float32
- name: '893'
dtype: float32
- name: '894'
dtype: float32
- name: '895'
dtype: float32
- name: '896'
dtype: float32
- name: '897'
dtype: float32
- name: '898'
dtype: float32
- name: '899'
dtype: float32
- name: '900'
dtype: float32
- name: '901'
dtype: float32
- name: '902'
dtype: float32
- name: '903'
dtype: float32
- name: '904'
dtype: float32
- name: '905'
dtype: float32
- name: '906'
dtype: float32
- name: '907'
dtype: float32
- name: '908'
dtype: float32
- name: '909'
dtype: float32
- name: '910'
dtype: float32
- name: '911'
dtype: float32
- name: '912'
dtype: float32
- name: '913'
dtype: float32
- name: '914'
dtype: float32
- name: '915'
dtype: float32
- name: '916'
dtype: float32
- name: '917'
dtype: float32
- name: '918'
dtype: float32
- name: '919'
dtype: float32
- name: '920'
dtype: float32
- name: '921'
dtype: float32
- name: '922'
dtype: float32
- name: '923'
dtype: float32
- name: '924'
dtype: float32
- name: '925'
dtype: float32
- name: '926'
dtype: float32
- name: '927'
dtype: float32
- name: '928'
dtype: float32
- name: '929'
dtype: float32
- name: '930'
dtype: float32
- name: '931'
dtype: float32
- name: '932'
dtype: float32
- name: '933'
dtype: float32
- name: '934'
dtype: float32
- name: '935'
dtype: float32
- name: '936'
dtype: float32
- name: '937'
dtype: float32
- name: '938'
dtype: float32
- name: '939'
dtype: float32
- name: '940'
dtype: float32
- name: '941'
dtype: float32
- name: '942'
dtype: float32
- name: '943'
dtype: float32
- name: '944'
dtype: float32
- name: '945'
dtype: float32
- name: '946'
dtype: float32
- name: '947'
dtype: float32
- name: '948'
dtype: float32
- name: '949'
dtype: float32
- name: '950'
dtype: float32
- name: '951'
dtype: float32
- name: '952'
dtype: float32
- name: '953'
dtype: float32
- name: '954'
dtype: float32
- name: '955'
dtype: float32
- name: '956'
dtype: float32
- name: '957'
dtype: float32
- name: '958'
dtype: float32
- name: '959'
dtype: float32
- name: '960'
dtype: float32
- name: '961'
dtype: float32
- name: '962'
dtype: float32
- name: '963'
dtype: float32
- name: '964'
dtype: float32
- name: '965'
dtype: float32
- name: '966'
dtype: float32
- name: '967'
dtype: float32
- name: '968'
dtype: float32
- name: '969'
dtype: float32
- name: '970'
dtype: float32
- name: '971'
dtype: float32
- name: '972'
dtype: float32
- name: '973'
dtype: float32
- name: '974'
dtype: float32
- name: '975'
dtype: float32
- name: '976'
dtype: float32
- name: '977'
dtype: float32
- name: '978'
dtype: float32
- name: '979'
dtype: float32
- name: '980'
dtype: float32
- name: '981'
dtype: float32
- name: '982'
dtype: float32
- name: '983'
dtype: float32
- name: '984'
dtype: float32
- name: '985'
dtype: float32
- name: '986'
dtype: float32
- name: '987'
dtype: float32
- name: '988'
dtype: float32
- name: '989'
dtype: float32
- name: '990'
dtype: float32
- name: '991'
dtype: float32
- name: '992'
dtype: float32
- name: '993'
dtype: float32
- name: '994'
dtype: float32
- name: '995'
dtype: float32
- name: '996'
dtype: float32
- name: '997'
dtype: float32
- name: '998'
dtype: float32
- name: '999'
dtype: float32
- name: '1000'
dtype: float32
- name: '1001'
dtype: float32
- name: '1002'
dtype: float32
- name: '1003'
dtype: float32
- name: '1004'
dtype: float32
- name: '1005'
dtype: float32
- name: '1006'
dtype: float32
- name: '1007'
dtype: float32
- name: '1008'
dtype: float32
- name: '1009'
dtype: float32
- name: '1010'
dtype: float32
- name: '1011'
dtype: float32
- name: '1012'
dtype: float32
- name: '1013'
dtype: float32
- name: '1014'
dtype: float32
- name: '1015'
dtype: float32
- name: '1016'
dtype: float32
- name: '1017'
dtype: float32
- name: '1018'
dtype: float32
- name: '1019'
dtype: float32
- name: '1020'
dtype: float32
- name: '1021'
dtype: float32
- name: '1022'
dtype: float32
- name: '1023'
dtype: float32
- name: '1024'
dtype: float32
- name: '1025'
dtype: float32
- name: '1026'
dtype: float32
- name: '1027'
dtype: float32
- name: '1028'
dtype: float32
- name: '1029'
dtype: float32
- name: '1030'
dtype: float32
- name: '1031'
dtype: float32
- name: '1032'
dtype: float32
- name: '1033'
dtype: float32
- name: '1034'
dtype: float32
- name: '1035'
dtype: float32
- name: '1036'
dtype: float32
- name: '1037'
dtype: float32
- name: '1038'
dtype: float32
- name: '1039'
dtype: float32
- name: '1040'
dtype: float32
- name: '1041'
dtype: float32
- name: '1042'
dtype: float32
- name: '1043'
dtype: float32
- name: '1044'
dtype: float32
- name: '1045'
dtype: float32
- name: '1046'
dtype: float32
- name: '1047'
dtype: float32
- name: '1048'
dtype: float32
- name: '1049'
dtype: float32
- name: '1050'
dtype: float32
- name: '1051'
dtype: float32
- name: '1052'
dtype: float32
- name: '1053'
dtype: float32
- name: '1054'
dtype: float32
- name: '1055'
dtype: float32
- name: '1056'
dtype: float32
- name: '1057'
dtype: float32
- name: '1058'
dtype: float32
- name: '1059'
dtype: float32
- name: '1060'
dtype: float32
- name: '1061'
dtype: float32
- name: '1062'
dtype: float32
- name: '1063'
dtype: float32
- name: '1064'
dtype: float32
- name: '1065'
dtype: float32
- name: '1066'
dtype: float32
- name: '1067'
dtype: float32
- name: '1068'
dtype: float32
- name: '1069'
dtype: float32
- name: '1070'
dtype: float32
- name: '1071'
dtype: float32
- name: '1072'
dtype: float32
- name: '1073'
dtype: float32
- name: '1074'
dtype: float32
- name: '1075'
dtype: float32
- name: '1076'
dtype: float32
- name: '1077'
dtype: float32
- name: '1078'
dtype: float32
- name: '1079'
dtype: float32
- name: '1080'
dtype: float32
- name: '1081'
dtype: float32
- name: '1082'
dtype: float32
- name: '1083'
dtype: float32
- name: '1084'
dtype: float32
- name: '1085'
dtype: float32
- name: '1086'
dtype: float32
- name: '1087'
dtype: float32
- name: '1088'
dtype: float32
- name: '1089'
dtype: float32
- name: '1090'
dtype: float32
- name: '1091'
dtype: float32
- name: '1092'
dtype: float32
- name: '1093'
dtype: float32
- name: '1094'
dtype: float32
- name: '1095'
dtype: float32
- name: '1096'
dtype: float32
- name: '1097'
dtype: float32
- name: '1098'
dtype: float32
- name: '1099'
dtype: float32
- name: '1100'
dtype: float32
- name: '1101'
dtype: float32
- name: '1102'
dtype: float32
- name: '1103'
dtype: float32
- name: '1104'
dtype: float32
- name: '1105'
dtype: float32
- name: '1106'
dtype: float32
- name: '1107'
dtype: float32
- name: '1108'
dtype: float32
- name: '1109'
dtype: float32
- name: '1110'
dtype: float32
- name: '1111'
dtype: float32
- name: '1112'
dtype: float32
- name: '1113'
dtype: float32
- name: '1114'
dtype: float32
- name: '1115'
dtype: float32
- name: '1116'
dtype: float32
- name: '1117'
dtype: float32
- name: '1118'
dtype: float32
- name: '1119'
dtype: float32
- name: '1120'
dtype: float32
- name: '1121'
dtype: float32
- name: '1122'
dtype: float32
- name: '1123'
dtype: float32
- name: '1124'
dtype: float32
- name: '1125'
dtype: float32
- name: '1126'
dtype: float32
- name: '1127'
dtype: float32
- name: '1128'
dtype: float32
- name: '1129'
dtype: float32
- name: '1130'
dtype: float32
- name: '1131'
dtype: float32
- name: '1132'
dtype: float32
- name: '1133'
dtype: float32
- name: '1134'
dtype: float32
- name: '1135'
dtype: float32
- name: '1136'
dtype: float32
- name: '1137'
dtype: float32
- name: '1138'
dtype: float32
- name: '1139'
dtype: float32
- name: '1140'
dtype: float32
- name: '1141'
dtype: float32
- name: '1142'
dtype: float32
- name: '1143'
dtype: float32
- name: '1144'
dtype: float32
- name: '1145'
dtype: float32
- name: '1146'
dtype: float32
- name: '1147'
dtype: float32
- name: '1148'
dtype: float32
- name: '1149'
dtype: float32
- name: '1150'
dtype: float32
- name: '1151'
dtype: float32
- name: '1152'
dtype: float32
- name: '1153'
dtype: float32
- name: '1154'
dtype: float32
- name: '1155'
dtype: float32
- name: '1156'
dtype: float32
- name: '1157'
dtype: float32
- name: '1158'
dtype: float32
- name: '1159'
dtype: float32
- name: '1160'
dtype: float32
- name: '1161'
dtype: float32
- name: '1162'
dtype: float32
- name: '1163'
dtype: float32
- name: '1164'
dtype: float32
- name: '1165'
dtype: float32
- name: '1166'
dtype: float32
- name: '1167'
dtype: float32
- name: '1168'
dtype: float32
- name: '1169'
dtype: float32
- name: '1170'
dtype: float32
- name: '1171'
dtype: float32
- name: '1172'
dtype: float32
- name: '1173'
dtype: float32
- name: '1174'
dtype: float32
- name: '1175'
dtype: float32
- name: '1176'
dtype: float32
- name: '1177'
dtype: float32
- name: '1178'
dtype: float32
- name: '1179'
dtype: float32
- name: '1180'
dtype: float32
- name: '1181'
dtype: float32
- name: '1182'
dtype: float32
- name: '1183'
dtype: float32
- name: '1184'
dtype: float32
- name: '1185'
dtype: float32
- name: '1186'
dtype: float32
- name: '1187'
dtype: float32
- name: '1188'
dtype: float32
- name: '1189'
dtype: float32
- name: '1190'
dtype: float32
- name: '1191'
dtype: float32
- name: '1192'
dtype: float32
- name: '1193'
dtype: float32
- name: '1194'
dtype: float32
- name: '1195'
dtype: float32
- name: '1196'
dtype: float32
- name: '1197'
dtype: float32
- name: '1198'
dtype: float32
- name: '1199'
dtype: float32
- name: '1200'
dtype: float32
- name: '1201'
dtype: float32
- name: '1202'
dtype: float32
- name: '1203'
dtype: float32
- name: '1204'
dtype: float32
- name: '1205'
dtype: float32
- name: '1206'
dtype: float32
- name: '1207'
dtype: float32
- name: '1208'
dtype: float32
- name: '1209'
dtype: float32
- name: '1210'
dtype: float32
- name: '1211'
dtype: float32
- name: '1212'
dtype: float32
- name: '1213'
dtype: float32
- name: '1214'
dtype: float32
- name: '1215'
dtype: float32
- name: '1216'
dtype: float32
- name: '1217'
dtype: float32
- name: '1218'
dtype: float32
- name: '1219'
dtype: float32
- name: '1220'
dtype: float32
- name: '1221'
dtype: float32
- name: '1222'
dtype: float32
- name: '1223'
dtype: float32
- name: '1224'
dtype: float32
- name: '1225'
dtype: float32
- name: '1226'
dtype: float32
- name: '1227'
dtype: float32
- name: '1228'
dtype: float32
- name: '1229'
dtype: float32
- name: '1230'
dtype: float32
- name: '1231'
dtype: float32
- name: '1232'
dtype: float32
- name: '1233'
dtype: float32
- name: '1234'
dtype: float32
- name: '1235'
dtype: float32
- name: '1236'
dtype: float32
- name: '1237'
dtype: float32
- name: '1238'
dtype: float32
- name: '1239'
dtype: float32
- name: '1240'
dtype: float32
- name: '1241'
dtype: float32
- name: '1242'
dtype: float32
- name: '1243'
dtype: float32
- name: '1244'
dtype: float32
- name: '1245'
dtype: float32
- name: '1246'
dtype: float32
- name: '1247'
dtype: float32
- name: '1248'
dtype: float32
- name: '1249'
dtype: float32
- name: '1250'
dtype: float32
- name: '1251'
dtype: float32
- name: '1252'
dtype: float32
- name: '1253'
dtype: float32
- name: '1254'
dtype: float32
- name: '1255'
dtype: float32
- name: '1256'
dtype: float32
- name: '1257'
dtype: float32
- name: '1258'
dtype: float32
- name: '1259'
dtype: float32
- name: '1260'
dtype: float32
- name: '1261'
dtype: float32
- name: '1262'
dtype: float32
- name: '1263'
dtype: float32
- name: '1264'
dtype: float32
- name: '1265'
dtype: float32
- name: '1266'
dtype: float32
- name: '1267'
dtype: float32
- name: '1268'
dtype: float32
- name: '1269'
dtype: float32
- name: '1270'
dtype: float32
- name: '1271'
dtype: float32
- name: '1272'
dtype: float32
- name: '1273'
dtype: float32
- name: '1274'
dtype: float32
- name: '1275'
dtype: float32
- name: '1276'
dtype: float32
- name: '1277'
dtype: float32
- name: '1278'
dtype: float32
- name: '1279'
dtype: float32
- name: '1280'
dtype: float32
- name: '1281'
dtype: float32
- name: '1282'
dtype: float32
- name: '1283'
dtype: float32
- name: '1284'
dtype: float32
- name: '1285'
dtype: float32
- name: '1286'
dtype: float32
- name: '1287'
dtype: float32
- name: '1288'
dtype: float32
- name: '1289'
dtype: float32
- name: '1290'
dtype: float32
- name: '1291'
dtype: float32
- name: '1292'
dtype: float32
- name: '1293'
dtype: float32
- name: '1294'
dtype: float32
- name: '1295'
dtype: float32
- name: '1296'
dtype: float32
- name: '1297'
dtype: float32
- name: '1298'
dtype: float32
- name: '1299'
dtype: float32
- name: '1300'
dtype: float32
- name: '1301'
dtype: float32
- name: '1302'
dtype: float32
- name: '1303'
dtype: float32
- name: '1304'
dtype: float32
- name: '1305'
dtype: float32
- name: '1306'
dtype: float32
- name: '1307'
dtype: float32
- name: '1308'
dtype: float32
- name: '1309'
dtype: float32
- name: '1310'
dtype: float32
- name: '1311'
dtype: float32
- name: '1312'
dtype: float32
- name: '1313'
dtype: float32
- name: '1314'
dtype: float32
- name: '1315'
dtype: float32
- name: '1316'
dtype: float32
- name: '1317'
dtype: float32
- name: '1318'
dtype: float32
- name: '1319'
dtype: float32
- name: '1320'
dtype: float32
- name: '1321'
dtype: float32
- name: '1322'
dtype: float32
- name: '1323'
dtype: float32
- name: '1324'
dtype: float32
- name: '1325'
dtype: float32
- name: '1326'
dtype: float32
- name: '1327'
dtype: float32
- name: '1328'
dtype: float32
- name: '1329'
dtype: float32
- name: '1330'
dtype: float32
- name: '1331'
dtype: float32
- name: '1332'
dtype: float32
- name: '1333'
dtype: float32
- name: '1334'
dtype: float32
- name: '1335'
dtype: float32
- name: '1336'
dtype: float32
- name: '1337'
dtype: float32
- name: '1338'
dtype: float32
- name: '1339'
dtype: float32
- name: '1340'
dtype: float32
- name: '1341'
dtype: float32
- name: '1342'
dtype: float32
- name: '1343'
dtype: float32
- name: '1344'
dtype: float32
- name: '1345'
dtype: float32
- name: '1346'
dtype: float32
- name: '1347'
dtype: float32
- name: '1348'
dtype: float32
- name: '1349'
dtype: float32
- name: '1350'
dtype: float32
- name: '1351'
dtype: float32
- name: '1352'
dtype: float32
- name: '1353'
dtype: float32
- name: '1354'
dtype: float32
- name: '1355'
dtype: float32
- name: '1356'
dtype: float32
- name: '1357'
dtype: float32
- name: '1358'
dtype: float32
- name: '1359'
dtype: float32
- name: '1360'
dtype: float32
- name: '1361'
dtype: float32
- name: '1362'
dtype: float32
- name: '1363'
dtype: float32
- name: '1364'
dtype: float32
- name: '1365'
dtype: float32
- name: '1366'
dtype: float32
- name: '1367'
dtype: float32
- name: '1368'
dtype: float32
- name: '1369'
dtype: float32
- name: '1370'
dtype: float32
- name: '1371'
dtype: float32
- name: '1372'
dtype: float32
- name: '1373'
dtype: float32
- name: '1374'
dtype: float32
- name: '1375'
dtype: float32
- name: '1376'
dtype: float32
- name: '1377'
dtype: float32
- name: '1378'
dtype: float32
- name: '1379'
dtype: float32
- name: '1380'
dtype: float32
- name: '1381'
dtype: float32
- name: '1382'
dtype: float32
- name: '1383'
dtype: float32
- name: '1384'
dtype: float32
- name: '1385'
dtype: float32
- name: '1386'
dtype: float32
- name: '1387'
dtype: float32
- name: '1388'
dtype: float32
- name: '1389'
dtype: float32
- name: '1390'
dtype: float32
- name: '1391'
dtype: float32
- name: '1392'
dtype: float32
- name: '1393'
dtype: float32
- name: '1394'
dtype: float32
- name: '1395'
dtype: float32
- name: '1396'
dtype: float32
- name: '1397'
dtype: float32
- name: '1398'
dtype: float32
- name: '1399'
dtype: float32
- name: '1400'
dtype: float32
- name: '1401'
dtype: float32
- name: '1402'
dtype: float32
- name: '1403'
dtype: float32
- name: '1404'
dtype: float32
- name: '1405'
dtype: float32
- name: '1406'
dtype: float32
- name: '1407'
dtype: float32
- name: '1408'
dtype: float32
- name: '1409'
dtype: float32
- name: '1410'
dtype: float32
- name: '1411'
dtype: float32
- name: '1412'
dtype: float32
- name: '1413'
dtype: float32
- name: '1414'
dtype: float32
- name: '1415'
dtype: float32
- name: '1416'
dtype: float32
- name: '1417'
dtype: float32
- name: '1418'
dtype: float32
- name: '1419'
dtype: float32
- name: '1420'
dtype: float32
- name: '1421'
dtype: float32
- name: '1422'
dtype: float32
- name: '1423'
dtype: float32
- name: '1424'
dtype: float32
- name: '1425'
dtype: float32
- name: '1426'
dtype: float32
- name: '1427'
dtype: float32
- name: '1428'
dtype: float32
- name: '1429'
dtype: float32
- name: '1430'
dtype: float32
- name: '1431'
dtype: float32
- name: '1432'
dtype: float32
- name: '1433'
dtype: float32
- name: '1434'
dtype: float32
- name: '1435'
dtype: float32
- name: '1436'
dtype: float32
- name: '1437'
dtype: float32
- name: '1438'
dtype: float32
- name: '1439'
dtype: float32
- name: '1440'
dtype: float32
- name: '1441'
dtype: float32
- name: '1442'
dtype: float32
- name: '1443'
dtype: float32
- name: '1444'
dtype: float32
- name: '1445'
dtype: float32
- name: '1446'
dtype: float32
- name: '1447'
dtype: float32
- name: '1448'
dtype: float32
- name: '1449'
dtype: float32
- name: '1450'
dtype: float32
- name: '1451'
dtype: float32
- name: '1452'
dtype: float32
- name: '1453'
dtype: float32
- name: '1454'
dtype: float32
- name: '1455'
dtype: float32
- name: '1456'
dtype: float32
- name: '1457'
dtype: float32
- name: '1458'
dtype: float32
- name: '1459'
dtype: float32
- name: '1460'
dtype: float32
- name: '1461'
dtype: float32
- name: '1462'
dtype: float32
- name: '1463'
dtype: float32
- name: '1464'
dtype: float32
- name: '1465'
dtype: float32
- name: '1466'
dtype: float32
- name: '1467'
dtype: float32
- name: '1468'
dtype: float32
- name: '1469'
dtype: float32
- name: '1470'
dtype: float32
- name: '1471'
dtype: float32
- name: '1472'
dtype: float32
- name: '1473'
dtype: float32
- name: '1474'
dtype: float32
- name: '1475'
dtype: float32
- name: '1476'
dtype: float32
- name: '1477'
dtype: float32
- name: '1478'
dtype: float32
- name: '1479'
dtype: float32
- name: '1480'
dtype: float32
- name: '1481'
dtype: float32
- name: '1482'
dtype: float32
- name: '1483'
dtype: float32
- name: '1484'
dtype: float32
- name: '1485'
dtype: float32
- name: '1486'
dtype: float32
- name: '1487'
dtype: float32
- name: '1488'
dtype: float32
- name: '1489'
dtype: float32
- name: '1490'
dtype: float32
- name: '1491'
dtype: float32
- name: '1492'
dtype: float32
- name: '1493'
dtype: float32
- name: '1494'
dtype: float32
- name: '1495'
dtype: float32
- name: '1496'
dtype: float32
- name: '1497'
dtype: float32
- name: '1498'
dtype: float32
- name: '1499'
dtype: float32
- name: '1500'
dtype: float32
- name: '1501'
dtype: float32
- name: '1502'
dtype: float32
- name: '1503'
dtype: float32
- name: '1504'
dtype: float32
- name: '1505'
dtype: float32
- name: '1506'
dtype: float32
- name: '1507'
dtype: float32
- name: '1508'
dtype: float32
- name: '1509'
dtype: float32
- name: '1510'
dtype: float32
- name: '1511'
dtype: float32
- name: '1512'
dtype: float32
- name: '1513'
dtype: float32
- name: '1514'
dtype: float32
- name: '1515'
dtype: float32
- name: '1516'
dtype: float32
- name: '1517'
dtype: float32
- name: '1518'
dtype: float32
- name: '1519'
dtype: float32
- name: '1520'
dtype: float32
- name: '1521'
dtype: float32
- name: '1522'
dtype: float32
- name: '1523'
dtype: float32
- name: '1524'
dtype: float32
- name: '1525'
dtype: float32
- name: '1526'
dtype: float32
- name: '1527'
dtype: float32
- name: '1528'
dtype: float32
- name: '1529'
dtype: float32
- name: '1530'
dtype: float32
- name: '1531'
dtype: float32
- name: '1532'
dtype: float32
- name: '1533'
dtype: float32
- name: '1534'
dtype: float32
- name: '1535'
dtype: float32
- name: '1536'
dtype: float32
- name: '1537'
dtype: float32
- name: '1538'
dtype: float32
- name: '1539'
dtype: float32
- name: '1540'
dtype: float32
- name: '1541'
dtype: float32
- name: '1542'
dtype: float32
- name: '1543'
dtype: float32
- name: '1544'
dtype: float32
- name: '1545'
dtype: float32
- name: '1546'
dtype: float32
- name: '1547'
dtype: float32
- name: '1548'
dtype: float32
- name: '1549'
dtype: float32
- name: '1550'
dtype: float32
- name: '1551'
dtype: float32
- name: '1552'
dtype: float32
- name: '1553'
dtype: float32
- name: '1554'
dtype: float32
- name: '1555'
dtype: float32
- name: '1556'
dtype: float32
- name: '1557'
dtype: float32
- name: '1558'
dtype: float32
- name: '1559'
dtype: float32
- name: '1560'
dtype: float32
- name: '1561'
dtype: float32
- name: '1562'
dtype: float32
- name: '1563'
dtype: float32
- name: '1564'
dtype: float32
- name: '1565'
dtype: float32
- name: '1566'
dtype: float32
- name: '1567'
dtype: float32
- name: '1568'
dtype: float32
- name: '1569'
dtype: float32
- name: '1570'
dtype: float32
- name: '1571'
dtype: float32
- name: '1572'
dtype: float32
- name: '1573'
dtype: float32
- name: '1574'
dtype: float32
- name: '1575'
dtype: float32
- name: '1576'
dtype: float32
- name: '1577'
dtype: float32
- name: '1578'
dtype: float32
- name: '1579'
dtype: float32
- name: '1580'
dtype: float32
- name: '1581'
dtype: float32
- name: '1582'
dtype: float32
- name: '1583'
dtype: float32
- name: '1584'
dtype: float32
- name: '1585'
dtype: float32
- name: '1586'
dtype: float32
- name: '1587'
dtype: float32
- name: '1588'
dtype: float32
- name: '1589'
dtype: float32
- name: '1590'
dtype: float32
- name: '1591'
dtype: float32
- name: '1592'
dtype: float32
- name: '1593'
dtype: float32
- name: '1594'
dtype: float32
- name: '1595'
dtype: float32
- name: '1596'
dtype: float32
- name: '1597'
dtype: float32
- name: '1598'
dtype: float32
- name: '1599'
dtype: float32
- name: '1600'
dtype: float32
- name: '1601'
dtype: float32
- name: '1602'
dtype: float32
- name: '1603'
dtype: float32
- name: '1604'
dtype: float32
- name: '1605'
dtype: float32
- name: '1606'
dtype: float32
- name: '1607'
dtype: float32
- name: '1608'
dtype: float32
- name: '1609'
dtype: float32
- name: '1610'
dtype: float32
- name: '1611'
dtype: float32
- name: '1612'
dtype: float32
- name: '1613'
dtype: float32
- name: '1614'
dtype: float32
- name: '1615'
dtype: float32
- name: '1616'
dtype: float32
- name: '1617'
dtype: float32
- name: '1618'
dtype: float32
- name: '1619'
dtype: float32
- name: '1620'
dtype: float32
- name: '1621'
dtype: float32
- name: '1622'
dtype: float32
- name: '1623'
dtype: float32
- name: '1624'
dtype: float32
- name: '1625'
dtype: float32
- name: '1626'
dtype: float32
- name: '1627'
dtype: float32
- name: '1628'
dtype: float32
- name: '1629'
dtype: float32
- name: '1630'
dtype: float32
- name: '1631'
dtype: float32
- name: '1632'
dtype: float32
- name: '1633'
dtype: float32
- name: '1634'
dtype: float32
- name: '1635'
dtype: float32
- name: '1636'
dtype: float32
- name: '1637'
dtype: float32
- name: '1638'
dtype: float32
- name: '1639'
dtype: float32
- name: '1640'
dtype: float32
- name: '1641'
dtype: float32
- name: '1642'
dtype: float32
- name: '1643'
dtype: float32
- name: '1644'
dtype: float32
- name: '1645'
dtype: float32
- name: '1646'
dtype: float32
- name: '1647'
dtype: float32
- name: '1648'
dtype: float32
- name: '1649'
dtype: float32
- name: '1650'
dtype: float32
- name: '1651'
dtype: float32
- name: '1652'
dtype: float32
- name: '1653'
dtype: float32
- name: '1654'
dtype: float32
- name: '1655'
dtype: float32
- name: '1656'
dtype: float32
- name: '1657'
dtype: float32
- name: '1658'
dtype: float32
- name: '1659'
dtype: float32
- name: '1660'
dtype: float32
- name: '1661'
dtype: float32
- name: '1662'
dtype: float32
- name: '1663'
dtype: float32
- name: '1664'
dtype: float32
- name: '1665'
dtype: float32
- name: '1666'
dtype: float32
- name: '1667'
dtype: float32
- name: '1668'
dtype: float32
- name: '1669'
dtype: float32
- name: '1670'
dtype: float32
- name: '1671'
dtype: float32
- name: '1672'
dtype: float32
- name: '1673'
dtype: float32
- name: '1674'
dtype: float32
- name: '1675'
dtype: float32
- name: '1676'
dtype: float32
- name: '1677'
dtype: float32
- name: '1678'
dtype: float32
- name: '1679'
dtype: float32
- name: '1680'
dtype: float32
- name: '1681'
dtype: float32
- name: '1682'
dtype: float32
- name: '1683'
dtype: float32
- name: '1684'
dtype: float32
- name: '1685'
dtype: float32
- name: '1686'
dtype: float32
- name: '1687'
dtype: float32
- name: '1688'
dtype: float32
- name: '1689'
dtype: float32
- name: '1690'
dtype: float32
- name: '1691'
dtype: float32
- name: '1692'
dtype: float32
- name: '1693'
dtype: float32
- name: '1694'
dtype: float32
- name: '1695'
dtype: float32
- name: '1696'
dtype: float32
- name: '1697'
dtype: float32
- name: '1698'
dtype: float32
- name: '1699'
dtype: float32
- name: '1700'
dtype: float32
- name: '1701'
dtype: float32
- name: '1702'
dtype: float32
- name: '1703'
dtype: float32
- name: '1704'
dtype: float32
- name: '1705'
dtype: float32
- name: '1706'
dtype: float32
- name: '1707'
dtype: float32
- name: '1708'
dtype: float32
- name: '1709'
dtype: float32
- name: '1710'
dtype: float32
- name: '1711'
dtype: float32
- name: '1712'
dtype: float32
- name: '1713'
dtype: float32
- name: '1714'
dtype: float32
- name: '1715'
dtype: float32
- name: '1716'
dtype: float32
- name: '1717'
dtype: float32
- name: '1718'
dtype: float32
- name: '1719'
dtype: float32
- name: '1720'
dtype: float32
- name: '1721'
dtype: float32
- name: '1722'
dtype: float32
- name: '1723'
dtype: float32
- name: '1724'
dtype: float32
- name: '1725'
dtype: float32
- name: '1726'
dtype: float32
- name: '1727'
dtype: float32
- name: '1728'
dtype: float32
- name: '1729'
dtype: float32
- name: '1730'
dtype: float32
- name: '1731'
dtype: float32
- name: '1732'
dtype: float32
- name: '1733'
dtype: float32
- name: '1734'
dtype: float32
- name: '1735'
dtype: float32
- name: '1736'
dtype: float32
- name: '1737'
dtype: float32
- name: '1738'
dtype: float32
- name: '1739'
dtype: float32
- name: '1740'
dtype: float32
- name: '1741'
dtype: float32
- name: '1742'
dtype: float32
- name: '1743'
dtype: float32
- name: '1744'
dtype: float32
- name: '1745'
dtype: float32
- name: '1746'
dtype: float32
- name: '1747'
dtype: float32
- name: '1748'
dtype: float32
- name: '1749'
dtype: float32
- name: '1750'
dtype: float32
- name: '1751'
dtype: float32
- name: '1752'
dtype: float32
- name: '1753'
dtype: float32
- name: '1754'
dtype: float32
- name: '1755'
dtype: float32
- name: '1756'
dtype: float32
- name: '1757'
dtype: float32
- name: '1758'
dtype: float32
- name: '1759'
dtype: float32
- name: '1760'
dtype: float32
- name: '1761'
dtype: float32
- name: '1762'
dtype: float32
- name: '1763'
dtype: float32
- name: '1764'
dtype: float32
- name: '1765'
dtype: float32
- name: '1766'
dtype: float32
- name: '1767'
dtype: float32
- name: '1768'
dtype: float32
- name: '1769'
dtype: float32
- name: '1770'
dtype: float32
- name: '1771'
dtype: float32
- name: '1772'
dtype: float32
- name: '1773'
dtype: float32
- name: '1774'
dtype: float32
- name: '1775'
dtype: float32
- name: '1776'
dtype: float32
- name: '1777'
dtype: float32
- name: '1778'
dtype: float32
- name: '1779'
dtype: float32
- name: '1780'
dtype: float32
- name: '1781'
dtype: float32
- name: '1782'
dtype: float32
- name: '1783'
dtype: float32
- name: '1784'
dtype: float32
- name: '1785'
dtype: float32
- name: '1786'
dtype: float32
- name: '1787'
dtype: float32
- name: '1788'
dtype: float32
- name: '1789'
dtype: float32
- name: '1790'
dtype: float32
- name: '1791'
dtype: float32
- name: '1792'
dtype: float32
- name: '1793'
dtype: float32
- name: '1794'
dtype: float32
- name: '1795'
dtype: float32
- name: '1796'
dtype: float32
- name: '1797'
dtype: float32
- name: '1798'
dtype: float32
- name: '1799'
dtype: float32
- name: '1800'
dtype: float32
- name: '1801'
dtype: float32
- name: '1802'
dtype: float32
- name: '1803'
dtype: float32
- name: '1804'
dtype: float32
- name: '1805'
dtype: float32
- name: '1806'
dtype: float32
- name: '1807'
dtype: float32
- name: '1808'
dtype: float32
- name: '1809'
dtype: float32
- name: '1810'
dtype: float32
- name: '1811'
dtype: float32
- name: '1812'
dtype: float32
- name: '1813'
dtype: float32
- name: '1814'
dtype: float32
- name: '1815'
dtype: float32
- name: '1816'
dtype: float32
- name: '1817'
dtype: float32
- name: '1818'
dtype: float32
- name: '1819'
dtype: float32
- name: '1820'
dtype: float32
- name: '1821'
dtype: float32
- name: '1822'
dtype: float32
- name: '1823'
dtype: float32
- name: '1824'
dtype: float32
- name: '1825'
dtype: float32
- name: '1826'
dtype: float32
- name: '1827'
dtype: float32
- name: '1828'
dtype: float32
- name: '1829'
dtype: float32
- name: '1830'
dtype: float32
- name: '1831'
dtype: float32
- name: '1832'
dtype: float32
- name: '1833'
dtype: float32
- name: '1834'
dtype: float32
- name: '1835'
dtype: float32
- name: '1836'
dtype: float32
- name: '1837'
dtype: float32
- name: '1838'
dtype: float32
- name: '1839'
dtype: float32
- name: '1840'
dtype: float32
- name: '1841'
dtype: float32
- name: '1842'
dtype: float32
- name: '1843'
dtype: float32
- name: '1844'
dtype: float32
- name: '1845'
dtype: float32
- name: '1846'
dtype: float32
- name: '1847'
dtype: float32
- name: '1848'
dtype: float32
- name: '1849'
dtype: float32
- name: '1850'
dtype: float32
- name: '1851'
dtype: float32
- name: '1852'
dtype: float32
- name: '1853'
dtype: float32
- name: '1854'
dtype: float32
- name: '1855'
dtype: float32
- name: '1856'
dtype: float32
- name: '1857'
dtype: float32
- name: '1858'
dtype: float32
- name: '1859'
dtype: float32
- name: '1860'
dtype: float32
- name: '1861'
dtype: float32
- name: '1862'
dtype: float32
- name: '1863'
dtype: float32
- name: '1864'
dtype: float32
- name: '1865'
dtype: float32
- name: '1866'
dtype: float32
- name: '1867'
dtype: float32
- name: '1868'
dtype: float32
- name: '1869'
dtype: float32
- name: '1870'
dtype: float32
- name: '1871'
dtype: float32
- name: '1872'
dtype: float32
- name: '1873'
dtype: float32
- name: '1874'
dtype: float32
- name: '1875'
dtype: float32
- name: '1876'
dtype: float32
- name: '1877'
dtype: float32
- name: '1878'
dtype: float32
- name: '1879'
dtype: float32
- name: '1880'
dtype: float32
- name: '1881'
dtype: float32
- name: '1882'
dtype: float32
- name: '1883'
dtype: float32
- name: '1884'
dtype: float32
- name: '1885'
dtype: float32
- name: '1886'
dtype: float32
- name: '1887'
dtype: float32
- name: '1888'
dtype: float32
- name: '1889'
dtype: float32
- name: '1890'
dtype: float32
- name: '1891'
dtype: float32
- name: '1892'
dtype: float32
- name: '1893'
dtype: float32
- name: '1894'
dtype: float32
- name: '1895'
dtype: float32
- name: '1896'
dtype: float32
- name: '1897'
dtype: float32
- name: '1898'
dtype: float32
- name: '1899'
dtype: float32
- name: '1900'
dtype: float32
- name: '1901'
dtype: float32
- name: '1902'
dtype: float32
- name: '1903'
dtype: float32
- name: '1904'
dtype: float32
- name: '1905'
dtype: float32
- name: '1906'
dtype: float32
- name: '1907'
dtype: float32
- name: '1908'
dtype: float32
- name: '1909'
dtype: float32
- name: '1910'
dtype: float32
- name: '1911'
dtype: float32
- name: '1912'
dtype: float32
- name: '1913'
dtype: float32
- name: '1914'
dtype: float32
- name: '1915'
dtype: float32
- name: '1916'
dtype: float32
- name: '1917'
dtype: float32
- name: '1918'
dtype: float32
- name: '1919'
dtype: float32
- name: '1920'
dtype: float32
- name: '1921'
dtype: float32
- name: '1922'
dtype: float32
- name: '1923'
dtype: float32
- name: '1924'
dtype: float32
- name: '1925'
dtype: float32
- name: '1926'
dtype: float32
- name: '1927'
dtype: float32
- name: '1928'
dtype: float32
- name: '1929'
dtype: float32
- name: '1930'
dtype: float32
- name: '1931'
dtype: float32
- name: '1932'
dtype: float32
- name: '1933'
dtype: float32
- name: '1934'
dtype: float32
- name: '1935'
dtype: float32
- name: '1936'
dtype: float32
- name: '1937'
dtype: float32
- name: '1938'
dtype: float32
- name: '1939'
dtype: float32
- name: '1940'
dtype: float32
- name: '1941'
dtype: float32
- name: '1942'
dtype: float32
- name: '1943'
dtype: float32
- name: '1944'
dtype: float32
- name: '1945'
dtype: float32
- name: '1946'
dtype: float32
- name: '1947'
dtype: float32
- name: '1948'
dtype: float32
- name: '1949'
dtype: float32
- name: '1950'
dtype: float32
- name: '1951'
dtype: float32
- name: '1952'
dtype: float32
- name: '1953'
dtype: float32
- name: '1954'
dtype: float32
- name: '1955'
dtype: float32
- name: '1956'
dtype: float32
- name: '1957'
dtype: float32
- name: '1958'
dtype: float32
- name: '1959'
dtype: float32
- name: '1960'
dtype: float32
- name: '1961'
dtype: float32
- name: '1962'
dtype: float32
- name: '1963'
dtype: float32
- name: '1964'
dtype: float32
- name: '1965'
dtype: float32
- name: '1966'
dtype: float32
- name: '1967'
dtype: float32
- name: '1968'
dtype: float32
- name: '1969'
dtype: float32
- name: '1970'
dtype: float32
- name: '1971'
dtype: float32
- name: '1972'
dtype: float32
- name: '1973'
dtype: float32
- name: '1974'
dtype: float32
- name: '1975'
dtype: float32
- name: '1976'
dtype: float32
- name: '1977'
dtype: float32
- name: '1978'
dtype: float32
- name: '1979'
dtype: float32
- name: '1980'
dtype: float32
- name: '1981'
dtype: float32
- name: '1982'
dtype: float32
- name: '1983'
dtype: float32
- name: '1984'
dtype: float32
- name: '1985'
dtype: float32
- name: '1986'
dtype: float32
- name: '1987'
dtype: float32
- name: '1988'
dtype: float32
- name: '1989'
dtype: float32
- name: '1990'
dtype: float32
- name: '1991'
dtype: float32
- name: '1992'
dtype: float32
- name: '1993'
dtype: float32
- name: '1994'
dtype: float32
- name: '1995'
dtype: float32
- name: '1996'
dtype: float32
- name: '1997'
dtype: float32
- name: '1998'
dtype: float32
- name: '1999'
dtype: float32
- name: '2000'
dtype: float32
- name: '2001'
dtype: float32
- name: '2002'
dtype: float32
- name: '2003'
dtype: float32
- name: '2004'
dtype: float32
- name: '2005'
dtype: float32
- name: '2006'
dtype: float32
- name: '2007'
dtype: float32
- name: '2008'
dtype: float32
- name: '2009'
dtype: float32
- name: '2010'
dtype: float32
- name: '2011'
dtype: float32
- name: '2012'
dtype: float32
- name: '2013'
dtype: float32
- name: '2014'
dtype: float32
- name: '2015'
dtype: float32
- name: '2016'
dtype: float32
- name: '2017'
dtype: float32
- name: '2018'
dtype: float32
- name: '2019'
dtype: float32
- name: '2020'
dtype: float32
- name: '2021'
dtype: float32
- name: '2022'
dtype: float32
- name: '2023'
dtype: float32
- name: '2024'
dtype: float32
- name: '2025'
dtype: float32
- name: '2026'
dtype: float32
- name: '2027'
dtype: float32
- name: '2028'
dtype: float32
- name: '2029'
dtype: float32
- name: '2030'
dtype: float32
- name: '2031'
dtype: float32
- name: '2032'
dtype: float32
- name: '2033'
dtype: float32
- name: '2034'
dtype: float32
- name: '2035'
dtype: float32
- name: '2036'
dtype: float32
- name: '2037'
dtype: float32
- name: '2038'
dtype: float32
- name: '2039'
dtype: float32
- name: '2040'
dtype: float32
- name: '2041'
dtype: float32
- name: '2042'
dtype: float32
- name: '2043'
dtype: float32
- name: '2044'
dtype: float32
- name: '2045'
dtype: float32
- name: '2046'
dtype: float32
- name: '2047'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 307621178.4375
num_examples: 37500
- name: test
num_bytes: 102540392.5
num_examples: 12500
download_size: 565362778
dataset_size: 410161570.9375
---
# Dataset Card for "CSIC_GPTNEO_Baseline"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
GEM-submissions/ratishsp__ent__1649421332 | ---
benchmark: gem
type: prediction
submission_name: ENT
tags:
- evaluation
- benchmark
---
# GEM Submission
Submission name: ENT
|
yzeng58/CoBSAT | ---
license: mit
task_categories:
- text-to-image
language:
- en
tags:
- MLLM
- in-context learning
- text-to-image generation
- T2I-ICL
- ICL
- NLP
- natural language processing
pretty_name: CoBSAT
size_categories:
- 1K<n<10K
---
**Dataset**: The CoBSAT benchmark evaluates the ability of MLLMs to perform T2I-ICL. It covers five themes: color, background, style, action, and texture, each with two different emphases: object-inference and attribute-inference. Here, we visualize the images and their corresponding labels and captions collected for our dataset. We further integrate the images and their labels for constructing the prompts for text-to-image in-context learning using the processing code provided in https://github.com/UW-Madison-Lee-Lab/CoBSAT.
**Paper Link**: https://arxiv.org/abs/2402.01293
```tex
@article{zeng2024can,
title={Can MLLMs Perform Text-to-Image In-Context Learning?},
author={Zeng, Yuchen and Kang, Wonjun and Chen, Yicong and Koo, Hyung Il and Lee, Kangwook},
journal={arXiv preprint arXiv:2402.01293},
year={2024}
}
```
|
khoomeik/gzipscale-0.42-100M | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 349975383
num_examples: 390625
download_size: 86936688
dataset_size: 349975383
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Nikutka/L1_scraped_korpus_wzorcowy_train | ---
dataset_info:
features:
- name: content
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 4838134
num_examples: 29488
download_size: 3466828
dataset_size: 4838134
---
# Dataset Card for "L1_scraped_korpus_wzorcowy_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yurinoviello/mmarco-corpus | ---
dataset_info:
features:
- name: id
dtype: int32
- name: text
dtype: string
splits:
- name: dev.full
num_bytes: 61428760
num_examples: 159096
- name: dev
num_bytes: 10627343
num_examples: 27433
download_size: 46482920
dataset_size: 72056103
configs:
- config_name: default
data_files:
- split: dev.full
path: data/dev.full-*
- split: dev
path: data/dev-*
---
|
lexklima/belmarq | ---
license: openrail
---
|
tyzhu/find_first_sent_train_400_eval_40_recite | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 1368863
num_examples: 840
- name: validation
num_bytes: 71989
num_examples: 40
download_size: 536857
dataset_size: 1440852
---
# Dataset Card for "find_first_sent_train_400_eval_40_recite"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CVasNLPExperiments/OK-VQA_test_google_flan_t5_xxl_mode_T_A_C_Q_rices_ns_5046 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
sequence: string
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_module_random_text
num_bytes: 5458212
num_examples: 5046
- name: fewshot_0_clip_tags_ViT_L_14_with_openai_Attributes_ViT_L_14_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_module_random_text
num_bytes: 5814379
num_examples: 5046
download_size: 2675960
dataset_size: 11272591
---
# Dataset Card for "OK-VQA_test_google_flan_t5_xxl_mode_T_A_C_Q_rices_ns_5046"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-131000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1015414
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
somosnlp/NoticIA-it | ---
language:
- es
size_categories:
- n<1K
task_categories:
- summarization
pretty_name: Resumen Noticias Clickbait
dataset_info:
features:
- name: id
dtype: int64
- name: titular
dtype: string
- name: respuesta
dtype: string
- name: pregunta
dtype: string
- name: texto
dtype: string
- name: idioma
dtype: string
- name: periodo
dtype: string
- name: tarea
dtype: string
splits:
- name: train
num_bytes: 5408185
num_examples: 700
- name: validation
num_bytes: 460068
num_examples: 50
- name: test
num_bytes: 777835
num_examples: 100
download_size: 3411307
dataset_size: 6646088
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
tags:
- summarization
- clickbait
- news
---
<p align="center">
<img src="https://huggingface.co/datasets/Iker/NoticIA/resolve/main/assets/logo.png" style="width: 50%;">
</p>
<h1 align="center">NoticIA: Un Dataset para el Resumen de Artículos Clickbait en Español.</h1>
Definimos un artículo clickbait como un artículo que busca atraer la atención del lector a través de la curiosidad.
Para ello, el titular plantea una pregunta o una afirmación incompleta, sansacionalista, exagerada o engañosa.
La respuesta a la pregunta generada en el titular, no suele aparecer hasta el final del artículo, la cual es precedida por una gran cantidad de contenido irrelevante.
El objetivo es que el usuario entre en la web a través del titular y después haga scroll hasta el final del artículo haciéndole ver la mayor cantidad de publicidad posible.
Los artículos clickbait suelen ser de baja calidad y no aportan valor al lector, más allá de la curiosidad inicial. Este fenómeno hace socavar la confianza del público en las fuentes de noticias.
Y afecta negativamente a los ingresos publicitarios de los creadores de contenidos legítimos, que podrían ver reducido su tráfico web.
Presentamos NoticIA, un conjunto de datos que consta de 850 artículos de noticias en español con titulares clickbait,
cada uno emparejado con resúmenes generativos de alta calidad de una sola frase escritos por humanos.
Esta tarea exige habilidades avanzadas de comprensión y resumen de texto, desafiando la capacidad de los modelos para inferir y
conectar diversas piezas de información para satisfacer la curiosidad informativa del usuario generada por el titular clickbait.
El proyecto está inspirado la cuenta de X/Twitter [@ahorrandoclick1](https://x.com/ahorrandoclick1).
[@ahorrandoclick1](https://x.com/ahorrandoclick1) cuenta con 300.000 seguidores, lo que demuestra el gran valor de realizar resúmenes de noticias clickbait.
Sin embargo, realizar estos resúmenes a mano, es una tarea muy laboriosa, y el número de noticias clickbait publicadas supera ampliante el número de resúmenes que
una persona puede realizar. Por lo tanto, existe la necesidad de generar resúmenes automáticos de noticias clickbait. Además, como hemos mencionado anteriormente, se
trata de una tarea ideal para analizar las capacidades de compresión de texto en español de un modelo de lenguaje.
# Ejemplos de Noticias Clickbait
La siguiente imágen muestra algunas noticias Clickbait extraídas de nuestro dataset. Como se puede ver, los titulares son altamente sensacionalistas, prometiendo al usuario una información que no cumple las expectivas, o que en algunos casos, ni siquiera existe.
Estos artículos no cumplen ninguna función informatica, y su único objetivo es generar ingresos publicitarios con los lectores que se ven atraídos por un titular engañoso.
<p align="center">
<img src="https://raw.githubusercontent.com/ikergarcia1996/NoticIA/main/assets/examples.png" style="width: 100%;">
</p>
# Recopilación de Noticias Clickbait
Hemos recopilado noticias clickbait usando la timeline del usuario de X/Twitter [@ahorrandoclick1](https://x.com/ahorrandoclick1). Para ello, hemos extraído
las url de las noticias mencionadas por el usuario. Además, hemos añadido aproximadamente 100 noticias clibait escogidas por nosotros. La siguiente imágen, muestra
la fuente de las noticias del dataset.
<p align="center">
<img src="https://raw.githubusercontent.com/ikergarcia1996/NoticIA/main/assets/noticia_dataset.png" style="width: 50%;">
</p>
Hemos clasificado cada una de las noticias en base a la categoría a la que pertenecen. Como se puede observar, nuestro dataset incluye una gran variedad de categorías.
<p align="center">
<img src="https://raw.githubusercontent.com/ikergarcia1996/NoticIA/main/assets/categories_distribution_spanish.png" style="width: 50%;">
</p>
# Anotación del dataset
Aunque [@ahorrandoclick1](https://x.com/ahorrandoclick1) reliza resúmenes de las noticias clickbat, estos resúmenes no siguen unas guidelines, y en muchos casos,
su resumen no hace referencia al texto, si no que son del estilo *"Esto es publicidad"*, *"Aún no se han enterado de que..."*. Por lo tanto, hemos generado
a mano el resumen de las 850 noticias. Para ello, hemos definido unas guidelines de anotación estrictas, disponibles en el siguiente enlace: [https://huggingface.co/spaces/Iker/ClickbaitAnnotation/blob/main/guidelines.py](https://huggingface.co/spaces/Iker/ClickbaitAnnotation/blob/main/guidelines.py).
El dataset ha sido anotado por [Iker García-Ferrero](https://ikergarcia1996.github.io/Iker-Garcia-Ferrero/) y [Begoña Altuna](https://www.linkedin.com/in/bego%C3%B1a-altuna-78014139), en este proceso se han invertido aproximadamente 40 horas.
# Estadísticas del dataset
Hemos dividido el dataset en tres splits, lo que facilita el entrenamiento de modelos. Como se puede ver en la siguiente tabla, los resúmenes de las noticias son extremadamente concisos.
Responden al titulat clcikbait de usando el menor número de palabras posibles.
| | Train | Validation | Test | Total |
|--------------------|-------|-----|------|-------|
| Número de artículos | 700 | 50 | 100 | 850 |
| Número medio de palabras en los titulates | 16 | 17 | 17 | 17 |
| Número medio de palabras del texto de la noticia | 544 | 663 | 549 | 552 |
| Número medio de palabras en los resúmenes | 12 | 11 | 11 | 12 |
# Validación de las anotaciones
Para validar el dataset, los 100 resúmenes del conjunto de Test han sido anotados por dos anotadores.
La concordancia general entre los anotadores ha sido alta, ya que han proporcionado exactamente la misma respuesta en el 26\% de los casos y han proporcionado respuestas que comparten parcialmente la información en el 48\% de los casos (misma respuesta, pero con alguna variación en las palabras utilizadas).
Esto demuestra que a los humanos les resultó fácil encontrar la información a la que se refiere el titular. También hemos identificado una lista de casos en los que los anotadores han ofrecido respuestas diferentes pero igualmente válidas, lo que constituye el 18\% de los casos.
Por último, identificamos 8 casos de desacuerdo. En 3 casos, uno de los anotadores realizó un resumen incorrecto,
probablemente debido al cansancio tras anotar múltiples ejemplos. En los 5 casos restantes, el desacuerdo se debió a información contradictoria en el artículo y
a diferentes interpretaciones de esta información. En estos casos, la determinación del resumen correcto queda sujeto a la interpretación del lector.
En cuanto a la evaluación de las guidelines, en general, no eran ambiguas, aunque que la petición de seleccionar la cantidad mínima de palabras para generar un
resumen válido a veces no es interpretada de la misma forma por los anotadores: Por ejemplo, la extensión mínima podría entenderse como el enfoque de la pregunta en el titular o una frase mínima bien formada.
En breves publicaremos un artículo con un análisis más detallado. Las anotaciones escritas por cada anotador pueden comprobarse en el siguiente enlace: [https://huggingface.co/datasets/Iker/NoticIA_Human_Validation](https://huggingface.co/datasets/Iker/NoticIA_Human_Validation).
# Formato de los datos
El dataset se encuentra listo para ser usado para evaluar modelos de lenguaje. Para ellos, hemos desarrollado un *prompt* que hace uso del titular de la noticia y el texto.
El prompt es el siguiente:
```python
def clickbait_prompt(
headline: str,
body: str,
) -> str:
"""
Generate the prompt for the model.
Args:
headline (`str`):
The headline of the article.
body (`str`):
The body of the article.
Returns:
`str`: The formatted prompt.
"""
return (
f"Ahora eres una Inteligencia Artificial experta en desmontar titulares sensacionalistas o clickbait. "
f"Tu tarea consiste en analizar noticias con titulares sensacionalistas y "
f"generar un resumen de una sola frase que revele la verdad detrás del titular.\n"
f"Este es el titular de la noticia: {headline}\n"
f"El titular plantea una pregunta o proporciona información incompleta. "
f"Debes buscar en el cuerpo de la noticia una frase que responda lo que se sugiere en el título. "
f"Responde siempre que puedas parafraseando el texto original. "
f"Usa siempre las mínimas palabras posibles. "
f"Recuerda responder siempre en Español.\n"
f"Este es el cuerpo de la noticia:\n"
f"{body}\n"
)
```
El output experado del modelo es el resúmen. A continuación, se muestra un ejemplo de como evaluar `gemma-2b` en nuestro dataset:
```
from transformers import pipeline
from datasets import load_dataset
generator = pipeline(model="google/gemma-2b-it",device_map="auto")
dataset = load_dataset("somosnlp/NoticIA-it",split="test")
outputs = generator(dataset[0]["prompt"], return_full_text=False,max_length=4096)
print(outputs)
```
El dataset incluye los siguientes campos:
- **ID**: id del ejemplo
- **titular**: Titular del artículo
- **respuesta**: Resumen escrito por un humano
- **pregunta**: Prompt listo para servir de input a un modelo de lenguaje
- **texto**: Texto del artículo, obtenido del HTML.
# Evaluación masiva de Modelos de Lenguaje
Como es habitual en las tareas de resumen, utilizamos la métrica de puntuación ROUGE para evaluar automáticamente los resúmenes producidos por los modelos.
Nuestra métrica principal es ROUGE-1, que considera las palabras enteras como unidades básicas. Para calcular la puntuación ROUGE, ponemos en minúsculas ambos resúmenes y eliminamos los signos de puntuación.
Además de la puntuación ROUGE, también tenemos en cuenta la longitud media de los resúmenes.
Para nuestra tarea, pretendemos que los resúmenes sean concisos, un aspecto que la puntuación ROUGE no evalúa. Por lo tanto, al evaluar los modelos tenemos en cuenta tanto la puntuación ROUGE-1 como la longitud media de los resúmenes. Nuestro objetivo es encontrar un modelo que consiga la mayor puntuación ROUGE posible con la menor longitud de resumen posible, equilibrando calidad y brevedad.
Hemos realizado una evaluación incluyendo los mejores modelos de lenguaje entrenados para seguir instrucciones actuales. Hemos usado el prompt definido previamente. El prompt es convertido al template de chat específico de cada modelo.
El código para reproducir los resultados se encuentra en el siguiente enlace: [https://github.com/ikergarcia1996/NoticIA](https://github.com/ikergarcia1996/NoticIA)
<p align="center">
<img src="https://huggingface.co/datasets/somosnlp/Resumen_Noticias_Clickbait/resolve/main/Results_zero.png" style="width: 100%;">
</p>
# Usos del dataset
Este dataset ha sido recopilado para su uso en investigación científica. Concretamente, para su uso en la evaluación de modelos de lenguaje en Español.
El uso comercial de este dataset está supedidado a las licencias de cada noticia y medio. Si quieres hacer un uso comercial del dataset tendrás que tener
el permiso expreso de los medios de los cuales han sido obtenidas las noticias. Prohibimos expresamente el uso de estos datos para dos casos de uso que consideramos
que pueden ser perjudiciales: El entrenamiento de modelos que generen titulares sensacionalistas o clickbait, y el entrenamiento de modelos que generen artículos o noticias de forma automática.
# Dataset Description
- **Author:** [Iker García-Ferrero](https://ikergarcia1996.github.io/Iker-Garcia-Ferrero/)
- **Author** [Begoña Altuna](https://www.linkedin.com/in/bego%C3%B1a-altuna-78014139)
- **Web Page**: [Github](https://github.com/ikergarcia1996/NoticIA)
- **Language(s) (NLP):** Spanish
# Autores
Este dataset ha sido creado por [Iker García-Ferrero](https://ikergarcia1996.github.io/Iker-Garcia-Ferrero/) y [Begoña Altuna](https://www.linkedin.com/in/bego%C3%B1a-altuna-78014139).
Somos investigadores en PLN en la Universidad del País Vasco, dentro del grupo de investigación [IXA](https://www.ixa.eus/) y formamos parte de [HiTZ, el Centro Vasco de Tecnología de la Lengua](https://www.hitz.eus/es).
<div style="display: flex; justify-content: space-around; width: 100%;">
<div style="width: 50%;" align="left">
<a href="http://ixa.si.ehu.es/">
<img src="https://raw.githubusercontent.com/ikergarcia1996/Iker-Garcia-Ferrero/master/icons/ixa.png" width="50" height="50" alt="Ixa NLP Group">
</a>
</div>
<div style="width: 50%;" align="right">
<a href="http://www.hitz.eus/">
<img src="https://raw.githubusercontent.com/ikergarcia1996/Iker-Garcia-Ferrero/master/icons/Hitz.png" width="300" height="50" alt="HiTZ Basque Center for Language Technologies">
</a>
</div>
</div> |
open-llm-leaderboard/details_TaylorAI__Flash-Llama-30M-20001 | ---
pretty_name: Evaluation run of TaylorAI/Flash-Llama-30M-20001
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TaylorAI/Flash-Llama-30M-20001](https://huggingface.co/TaylorAI/Flash-Llama-30M-20001)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TaylorAI__Flash-Llama-30M-20001\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T02:44:26.412393](https://huggingface.co/datasets/open-llm-leaderboard/details_TaylorAI__Flash-Llama-30M-20001/blob/main/results_2023-09-17T02-44-26.412393.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.002307046979865772,\n\
\ \"em_stderr\": 0.0004913221265094458,\n \"f1\": 0.006848783557046977,\n\
\ \"f1_stderr\": 0.0006387737069456149,\n \"acc\": 0.2541436464088398,\n\
\ \"acc_stderr\": 0.007025277661412097\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.002307046979865772,\n \"em_stderr\": 0.0004913221265094458,\n\
\ \"f1\": 0.006848783557046977,\n \"f1_stderr\": 0.0006387737069456149\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5082872928176796,\n\
\ \"acc_stderr\": 0.014050555322824194\n }\n}\n```"
repo_url: https://huggingface.co/TaylorAI/Flash-Llama-30M-20001
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|arc:challenge|25_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T02_44_26.412393
path:
- '**/details_harness|drop|3_2023-09-17T02-44-26.412393.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T02-44-26.412393.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T02_44_26.412393
path:
- '**/details_harness|gsm8k|5_2023-09-17T02-44-26.412393.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T02-44-26.412393.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hellaswag|10_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-06T09-53-56.209295.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-06T09-53-56.209295.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-06T09-53-56.209295.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T02_44_26.412393
path:
- '**/details_harness|winogrande|5_2023-09-17T02-44-26.412393.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T02-44-26.412393.parquet'
- config_name: results
data_files:
- split: 2023_09_06T09_53_56.209295
path:
- results_2023-09-06T09-53-56.209295.parquet
- split: 2023_09_17T02_44_26.412393
path:
- results_2023-09-17T02-44-26.412393.parquet
- split: latest
path:
- results_2023-09-17T02-44-26.412393.parquet
---
# Dataset Card for Evaluation run of TaylorAI/Flash-Llama-30M-20001
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TaylorAI/Flash-Llama-30M-20001
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TaylorAI/Flash-Llama-30M-20001](https://huggingface.co/TaylorAI/Flash-Llama-30M-20001) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TaylorAI__Flash-Llama-30M-20001",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-17T02:44:26.412393](https://huggingface.co/datasets/open-llm-leaderboard/details_TaylorAI__Flash-Llama-30M-20001/blob/main/results_2023-09-17T02-44-26.412393.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.002307046979865772,
"em_stderr": 0.0004913221265094458,
"f1": 0.006848783557046977,
"f1_stderr": 0.0006387737069456149,
"acc": 0.2541436464088398,
"acc_stderr": 0.007025277661412097
},
"harness|drop|3": {
"em": 0.002307046979865772,
"em_stderr": 0.0004913221265094458,
"f1": 0.006848783557046977,
"f1_stderr": 0.0006387737069456149
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5082872928176796,
"acc_stderr": 0.014050555322824194
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
RepoFusion/Stack-Repo | ---
license: other
---
# Summary of the Dataset
## Description
Stack-Repo is a dataset of 200 Java repositories from GitHub with permissive licenses and near-deduplicated files that are augmented with three types of repository contexts.
- Prompt Proposal (PP) Contexts: These contexts are based on the prompt proposals from the paper [Repository-Level Prompt Generation for Large Language Models of Code](https://arxiv.org/abs/2206.12839).
- BM25 Contexts: These contexts are obtained based on the BM25 similarity scores.
- RandomNN Contexts: These contexts are obtained using the nearest neighbors in the representation space of an embedding model.
For more details, please check our paper [RepoFusion: Training Code Models to Understand Your Repository](https://arxiv.org/abs/2306.10998).
The original Java source files are obtained using a [modified version](https://huggingface.co/datasets/bigcode/the-stack-dedup) of [The Stack](https://huggingface.co/datasets/bigcode/the-stack).
## Data Splits
The dataset consists of three splits: `train`, `validation` and `test`, comprising of 100, 50, and 50 repositories, respectively.
## Data Organization
Each split contains separate folder for a repository where each repository contains all `.java` source code files in the repository in the original directory structure along with three `.json` files corresponding to the PP, BM25 and RandomNN repo contexts. In terms of the HuggingFace Datasets terminology, we have four subdatasets or configurations.
- `PP_contexts`: Propmt Proposal repo contexts.
- `bm25_contexts`: BM25 repo contexts.
- `randomNN_contexts`: RandomNN repo contexts.
- `sources`: actual java (`.java`) source code files
# Dataset Usage
To clone the dataset locally
```
git clone https://huggingface.co/datasets/RepoFusion/Stack-Repo <local_path>
```
To load the dataset desired configuration and split:
```python
import datasets
ds = datasets.load_dataset(
"RepoFusion/Stack-Repo",
name="<configuration_name>",
split="<split_name>"
data_dir="<local_path>"
)
```
NOTE: The configurations for the repo contexts `bm25_contexts`, `PP_contexts` and `randomNN_contexts` can be loaded directly by specifying the corresponding
`<configuration_name>` along with the `<split_name>` in the load_dataset command listed above without cloning the repo locally.
For the `sources` if not cloned beforehand or `data_dir` not specified, `ManualDownloadError` will be raised.
## Data Format
The expected data format of the `.json` files is a list of target holes and corresponding repo contexts where each entry in the `.json` file corresponds to a target hole consisting of the location of the target hole, the target hole as a string, the surrounding context as a string and a list of repo-contexts as strings. Specifically, each row is a dictionary containing
- `id`: hole_id (location of the target hole)
- `question`: surrounding context
- `target`: target hole
- `ctxs`: a list of repo contexts where each item is a dictionary containing
- `title`: name of the repo context
- `text`: content of the repo context
The actual java sources can be accessed via file system directly. The format is like this `[<data_set_root>/data/<split_name>/<github_user>/<repo_name>/<path/to/every/java/file/in/the/repo>.java]`. When accessed through `Datasets.load_dataset`, the data fields for the `sources` can be specified as below.
```python
features = datasets.Features({
'file': datasets.Value('string'),
'content': datasets.Value('string')
})
```
When accessed through `Datasets.load_dataset`, the data fields for the repo contexts can be specified as below.
```python
features = datasets.Features({
'id': datasets.Value('string'),
'hole_file': datasets.Value('string'),
'hole_line': datasets.Value('int32'),
'hole_pos': datasets.Value('int32'),
'question': datasets.Value('string'),
'target': datasets.Value('string'),
'answers': datasets.Sequence(
datasets.Value('string')
),
'ctxs': [{
'title': datasets.Value('string'),
'text': datasets.Value('string'),
'score': datasets.Value('float64')
}]
})
```
# Additional Information
## Dataset Curators
- Disha Shrivastava, dishu.905@gmail.com
- Denis Kocetkov, denis.kocetkov@servicenow.com
## Licensing Information
Stack-Repo is derived from a modified version of The Stack. The Stack is a collection of source code from repositories with various licenses. Any use of all or part of the code gathered in The Stack must abide by the terms of the original licenses, including attribution clauses when relevant. We facilitate this by providing provenance information for each data point.
The list of [SPDX license identifiers](https://spdx.org/licenses/) included in the dataset can be found [here](https://huggingface.co/datasets/bigcode/the-stack-dedup/blob/main/licenses.json).
## Citation
```
@article{shrivastava2023repofusion,
title={RepoFusion: Training Code Models to Understand Your Repository},
author={Shrivastava, Disha and Kocetkov, Denis and de Vries, Harm and Bahdanau, Dzmitry and Scholak, Torsten},
journal={arXiv preprint arXiv:2306.10998},
year={2023}
}
```
|
zolak/twitter_dataset_80_1713061598 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 3248717
num_examples: 7868
download_size: 1647115
dataset_size: 3248717
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-staging-eval-launch__gov_report-plain_text-1abd3a-16146235 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- launch/gov_report
eval_info:
task: summarization
model: facebook/bart-large-cnn
metrics: ['bertscore']
dataset_name: launch/gov_report
dataset_config: plain_text
dataset_split: test
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: facebook/bart-large-cnn
* Dataset: launch/gov_report
* Config: plain_text
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@nonchalant-nagavalli](https://huggingface.co/nonchalant-nagavalli) for evaluating this model. |
pszemraj/SQuALITY-v1.3-flat | ---
license: apache-2.0
task_categories:
- text2text-generation
- summarization
language:
- en
size_categories:
- 1K<n<10K
source_datasets: pszemraj/SQuALITY-v1.3
---
# SQuALITY-v1.3-flat
A formatted/flat version of [the original](https://huggingface.co/datasets/pszemraj/SQuALITY-v1.3)
--- |
aherntech/spider-realistic | ---
license: cc-by-4.0
task_categories:
- text2text-generation
language:
- en
tags:
- text-to-sql
pretty_name: Spider-Rea;ostoc
size_categories:
- n<1K
---
# Dataset Card for Spider-Releastic
This dataset variant contains only the Spider Realistic dataset used in "Structure-Grounded Pretraining for Text-to-SQL". The dataset is created based on the dev split of the Spider dataset (2020-06-07 version from https://yale-lily.github.io/spider). The authors of the dataset modified the original questions to remove the explicit mention of column names while keeping the SQL queries unchanged to better evaluate the model's capability in aligning the NL utterance and the DB schema. For more details, please refer to the authors paper https://arxiv.org/abs/2010.12773. The SQL queries and databases from the original Spider dataset are kept unchanged.
For the official database files, please refer to the Spider release site: https://yale-lily.github.io/spider.
This dataset was copied from Zenodo: https://zenodo.org/records/5205322.
This dataset is distributed under the CC BY-SA 4.0 license.
## Paper Abstract
> Learning to capture text-table alignment is essential for tasks like text-to-SQL. A model needs to correctly recognize natural language references to columns and values and to ground them in the given database schema. In this paper, we present a novel weakly supervised Structure-Grounded pretraining framework (StruG) for text-to-SQL that can effectively learn to capture text-table alignment based on a parallel text-table corpus. We identify a set of novel prediction tasks: column grounding, value grounding and column-value mapping, and leverage them to pretrain a text-table encoder. Additionally, to evaluate different methods under more realistic text-table alignment settings, we create a new evaluation set Spider-Realistic based on Spider dev set with explicit mentions of column names removed, and adopt eight existing text-to-SQL datasets for cross-database evaluation. STRUG brings significant improvement over BERT-LARGE in all settings. Compared with existing pretraining methods such as GRAPPA, STRUG achieves similar performance on Spider, and outperforms all baselines on more realistic sets.
## Citation Information
If you use the dataset, please cite the following papers including the original Spider datasets, Finegan-Dollak et al., 2018 and the original datasets for Restaurants, GeoQuery, Scholar, Academic, IMDB, and Yelp.
```
@article{deng2020structure,
title={Structure-Grounded Pretraining for Text-to-SQL},
author={Deng, Xiang and Awadallah, Ahmed Hassan and Meek, Christopher and Polozov, Oleksandr and Sun, Huan and Richardson, Matthew},
journal={arXiv preprint arXiv:2010.12773},
year={2020}
}
@inproceedings{Yu&al.18c,
year = 2018,
title = {Spider: A Large-Scale Human-Labeled Dataset for Complex and Cross-Domain Semantic Parsing and Text-to-SQL Task},
booktitle = {EMNLP},
author = {Tao Yu and Rui Zhang and Kai Yang and Michihiro Yasunaga and Dongxu Wang and Zifan Li and James Ma and Irene Li and Qingning Yao and Shanelle Roman and Zilin Zhang and Dragomir Radev }
}
@InProceedings{P18-1033,
author = "Finegan-Dollak, Catherine
and Kummerfeld, Jonathan K.
and Zhang, Li
and Ramanathan, Karthik
and Sadasivam, Sesh
and Zhang, Rui
and Radev, Dragomir",
title = "Improving Text-to-SQL Evaluation Methodology",
booktitle = "Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
year = "2018",
publisher = "Association for Computational Linguistics",
pages = "351--360",
location = "Melbourne, Australia",
url = "http://aclweb.org/anthology/P18-1033"
}
@InProceedings{data-sql-imdb-yelp,
dataset = {IMDB and Yelp},
author = {Navid Yaghmazadeh, Yuepeng Wang, Isil Dillig, and Thomas Dillig},
title = {SQLizer: Query Synthesis from Natural Language},
booktitle = {International Conference on Object-Oriented Programming, Systems, Languages, and Applications, ACM},
month = {October},
year = {2017},
pages = {63:1--63:26},
url = {http://doi.org/10.1145/3133887},
}
@article{data-academic,
dataset = {Academic},
author = {Fei Li and H. V. Jagadish},
title = {Constructing an Interactive Natural Language Interface for Relational Databases},
journal = {Proceedings of the VLDB Endowment},
volume = {8},
number = {1},
month = {September},
year = {2014},
pages = {73--84},
url = {http://dx.doi.org/10.14778/2735461.2735468},
}
@InProceedings{data-atis-geography-scholar,
dataset = {Scholar, and Updated ATIS and Geography},
author = {Srinivasan Iyer, Ioannis Konstas, Alvin Cheung, Jayant Krishnamurthy, and Luke Zettlemoyer},
title = {Learning a Neural Semantic Parser from User Feedback},
booktitle = {Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)},
year = {2017},
pages = {963--973},
location = {Vancouver, Canada},
url = {http://www.aclweb.org/anthology/P17-1089},
}
@inproceedings{data-geography-original
dataset = {Geography, original},
author = {John M. Zelle and Raymond J. Mooney},
title = {Learning to Parse Database Queries Using Inductive Logic Programming},
booktitle = {Proceedings of the Thirteenth National Conference on Artificial Intelligence - Volume 2},
year = {1996},
pages = {1050--1055},
location = {Portland, Oregon},
url = {http://dl.acm.org/citation.cfm?id=1864519.1864543},
}
@inproceedings{data-restaurants-logic,
author = {Lappoon R. Tang and Raymond J. Mooney},
title = {Automated Construction of Database Interfaces: Intergrating Statistical and Relational Learning for Semantic Parsing},
booktitle = {2000 Joint SIGDAT Conference on Empirical Methods in Natural Language Processing and Very Large Corpora},
year = {2000},
pages = {133--141},
location = {Hong Kong, China},
url = {http://www.aclweb.org/anthology/W00-1317},
}
@inproceedings{data-restaurants-original,
author = {Ana-Maria Popescu, Oren Etzioni, and Henry Kautz},
title = {Towards a Theory of Natural Language Interfaces to Databases},
booktitle = {Proceedings of the 8th International Conference on Intelligent User Interfaces},
year = {2003},
location = {Miami, Florida, USA},
pages = {149--157},
url = {http://doi.acm.org/10.1145/604045.604070},
}
@inproceedings{data-restaurants,
author = {Alessandra Giordani and Alessandro Moschitti},
title = {Automatic Generation and Reranking of SQL-derived Answers to NL Questions},
booktitle = {Proceedings of the Second International Conference on Trustworthy Eternal Systems via Evolving Software, Data and Knowledge},
year = {2012},
location = {Montpellier, France},
pages = {59--76},
url = {https://doi.org/10.1007/978-3-642-45260-4_5},
}
``` |
jtatman/medquad-medicalqa-wizdolalpaca-instruct | ---
dataset_info:
features:
- name: input
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: system
dtype: string
splits:
- name: train
num_bytes: 76066911.30719037
num_examples: 48452
download_size: 38181130
dataset_size: 76066911.30719037
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
##MedQA Ethics
This dataset is an amalgam of two:
- yc4142/ethics-CoT
- keivalya/MedQuad-MedicalQnADataset
...with a smattering of something close to this, but uncensored:
- nRuaif/wizard_alpaca_dolly_orca
Reasoning is thus:
- Medical datasets must be closely tied to ethics, from inception.
- Wiz/Dol/Paca/Orc lends abstract reasoning, another important aspect of a medical model - without reason, the contents are drier and less relatable.
- With the size of datasets for q/a and reason shrinking, it probably can be refined quite a bit
- Medical models must take into account tree of thought processing - nowhere is it worse to have an incorrect response than medical advice or diagnosis
|
hieule/vie-book | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 7481512027
num_examples: 3620527
download_size: 3948213824
dataset_size: 7481512027
---
# Dataset Card for "vie-book"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cashu/indian-history | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1293144
num_examples: 1548
download_size: 746941
dataset_size: 1293144
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-2.7B-ShareGPT | ---
pretty_name: Evaluation run of princeton-nlp/Sheared-LLaMA-2.7B-ShareGPT
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [princeton-nlp/Sheared-LLaMA-2.7B-ShareGPT](https://huggingface.co/princeton-nlp/Sheared-LLaMA-2.7B-ShareGPT)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-2.7B-ShareGPT\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-05T12:10:45.462405](https://huggingface.co/datasets/open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-2.7B-ShareGPT/blob/main/results_2024-01-05T12-10-45.462405.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2919655714473898,\n\
\ \"acc_stderr\": 0.0318639028810806,\n \"acc_norm\": 0.2944023668702236,\n\
\ \"acc_norm_stderr\": 0.032711391877200874,\n \"mc1\": 0.3011015911872705,\n\
\ \"mc1_stderr\": 0.01605899902610062,\n \"mc2\": 0.4771392382771529,\n\
\ \"mc2_stderr\": 0.015567072294317703\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3779863481228669,\n \"acc_stderr\": 0.014169664520303101,\n\
\ \"acc_norm\": 0.4104095563139932,\n \"acc_norm_stderr\": 0.014374922192642666\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5435172276438957,\n\
\ \"acc_stderr\": 0.00497084669755231,\n \"acc_norm\": 0.7126070503883688,\n\
\ \"acc_norm_stderr\": 0.004516215206715344\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.32894736842105265,\n \"acc_stderr\": 0.03823428969926604,\n\
\ \"acc_norm\": 0.32894736842105265,\n \"acc_norm_stderr\": 0.03823428969926604\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.27,\n\
\ \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.27,\n \
\ \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.33584905660377357,\n \"acc_stderr\": 0.02906722014664483,\n\
\ \"acc_norm\": 0.33584905660377357,\n \"acc_norm_stderr\": 0.02906722014664483\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n\
\ \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n\
\ \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.16666666666666666,\n \"acc_stderr\": 0.03708284662416545,\n\
\ \"acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.03708284662416545\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2297872340425532,\n \"acc_stderr\": 0.027501752944412417,\n\
\ \"acc_norm\": 0.2297872340425532,\n \"acc_norm_stderr\": 0.027501752944412417\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489362,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489362\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2827586206896552,\n \"acc_stderr\": 0.03752833958003336,\n\
\ \"acc_norm\": 0.2827586206896552,\n \"acc_norm_stderr\": 0.03752833958003336\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25925925925925924,\n \"acc_stderr\": 0.022569897074918424,\n \"\
acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.022569897074918424\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.037184890068181146,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.037184890068181146\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2967741935483871,\n\
\ \"acc_stderr\": 0.0259885007924119,\n \"acc_norm\": 0.2967741935483871,\n\
\ \"acc_norm_stderr\": 0.0259885007924119\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2315270935960591,\n \"acc_stderr\": 0.029678333141444444,\n\
\ \"acc_norm\": 0.2315270935960591,\n \"acc_norm_stderr\": 0.029678333141444444\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\"\
: 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3393939393939394,\n \"acc_stderr\": 0.03697442205031595,\n\
\ \"acc_norm\": 0.3393939393939394,\n \"acc_norm_stderr\": 0.03697442205031595\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.25252525252525254,\n \"acc_stderr\": 0.030954055470365897,\n \"\
acc_norm\": 0.25252525252525254,\n \"acc_norm_stderr\": 0.030954055470365897\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.37305699481865284,\n \"acc_stderr\": 0.03490205592048574,\n\
\ \"acc_norm\": 0.37305699481865284,\n \"acc_norm_stderr\": 0.03490205592048574\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.29743589743589743,\n \"acc_stderr\": 0.02317740813146593,\n\
\ \"acc_norm\": 0.29743589743589743,\n \"acc_norm_stderr\": 0.02317740813146593\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02696242432507383,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02696242432507383\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2689075630252101,\n \"acc_stderr\": 0.028801392193631276,\n\
\ \"acc_norm\": 0.2689075630252101,\n \"acc_norm_stderr\": 0.028801392193631276\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.24770642201834864,\n \"acc_stderr\": 0.018508143602547808,\n \"\
acc_norm\": 0.24770642201834864,\n \"acc_norm_stderr\": 0.018508143602547808\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.30092592592592593,\n \"acc_stderr\": 0.031280390843298825,\n \"\
acc_norm\": 0.30092592592592593,\n \"acc_norm_stderr\": 0.031280390843298825\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.3235294117647059,\n \"acc_stderr\": 0.032834720561085676,\n \"\
acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.032834720561085676\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.32489451476793246,\n \"acc_stderr\": 0.030486039389105303,\n \
\ \"acc_norm\": 0.32489451476793246,\n \"acc_norm_stderr\": 0.030486039389105303\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3094170403587444,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.3094170403587444,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.039153454088478354,\n\
\ \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.039153454088478354\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.39669421487603307,\n \"acc_stderr\": 0.044658697805310094,\n \"\
acc_norm\": 0.39669421487603307,\n \"acc_norm_stderr\": 0.044658697805310094\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.04330043749650743,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.04330043749650743\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3312883435582822,\n \"acc_stderr\": 0.03697983910025588,\n\
\ \"acc_norm\": 0.3312883435582822,\n \"acc_norm_stderr\": 0.03697983910025588\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n\
\ \"acc_stderr\": 0.04203277291467762,\n \"acc_norm\": 0.26785714285714285,\n\
\ \"acc_norm_stderr\": 0.04203277291467762\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.20388349514563106,\n \"acc_stderr\": 0.03989139859531773,\n\
\ \"acc_norm\": 0.20388349514563106,\n \"acc_norm_stderr\": 0.03989139859531773\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.29914529914529914,\n\
\ \"acc_stderr\": 0.029996951858349472,\n \"acc_norm\": 0.29914529914529914,\n\
\ \"acc_norm_stderr\": 0.029996951858349472\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.35887611749680715,\n\
\ \"acc_stderr\": 0.017152991797501342,\n \"acc_norm\": 0.35887611749680715,\n\
\ \"acc_norm_stderr\": 0.017152991797501342\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3236994219653179,\n \"acc_stderr\": 0.02519018132760841,\n\
\ \"acc_norm\": 0.3236994219653179,\n \"acc_norm_stderr\": 0.02519018132760841\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n\
\ \"acc_stderr\": 0.014310999547961441,\n \"acc_norm\": 0.24134078212290502,\n\
\ \"acc_norm_stderr\": 0.014310999547961441\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.31699346405228757,\n \"acc_stderr\": 0.02664327847450875,\n\
\ \"acc_norm\": 0.31699346405228757,\n \"acc_norm_stderr\": 0.02664327847450875\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3279742765273312,\n\
\ \"acc_stderr\": 0.0266644108869376,\n \"acc_norm\": 0.3279742765273312,\n\
\ \"acc_norm_stderr\": 0.0266644108869376\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.32098765432098764,\n \"acc_stderr\": 0.02597656601086274,\n\
\ \"acc_norm\": 0.32098765432098764,\n \"acc_norm_stderr\": 0.02597656601086274\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25177304964539005,\n \"acc_stderr\": 0.025892151156709405,\n \
\ \"acc_norm\": 0.25177304964539005,\n \"acc_norm_stderr\": 0.025892151156709405\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2796610169491525,\n\
\ \"acc_stderr\": 0.011463397393861973,\n \"acc_norm\": 0.2796610169491525,\n\
\ \"acc_norm_stderr\": 0.011463397393861973\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3161764705882353,\n \"acc_stderr\": 0.028245687391462916,\n\
\ \"acc_norm\": 0.3161764705882353,\n \"acc_norm_stderr\": 0.028245687391462916\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2973856209150327,\n \"acc_stderr\": 0.018492596536396955,\n \
\ \"acc_norm\": 0.2973856209150327,\n \"acc_norm_stderr\": 0.018492596536396955\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n\
\ \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n\
\ \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3551020408163265,\n \"acc_stderr\": 0.03063565515038764,\n\
\ \"acc_norm\": 0.3551020408163265,\n \"acc_norm_stderr\": 0.03063565515038764\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2835820895522388,\n\
\ \"acc_stderr\": 0.031871875379197986,\n \"acc_norm\": 0.2835820895522388,\n\
\ \"acc_norm_stderr\": 0.031871875379197986\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2891566265060241,\n\
\ \"acc_stderr\": 0.03529486801511115,\n \"acc_norm\": 0.2891566265060241,\n\
\ \"acc_norm_stderr\": 0.03529486801511115\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3011015911872705,\n\
\ \"mc1_stderr\": 0.01605899902610062,\n \"mc2\": 0.4771392382771529,\n\
\ \"mc2_stderr\": 0.015567072294317703\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6416732438831886,\n \"acc_stderr\": 0.013476581172567528\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/princeton-nlp/Sheared-LLaMA-2.7B-ShareGPT
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|arc:challenge|25_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|gsm8k|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hellaswag|10_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T12-10-45.462405.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T12-10-45.462405.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- '**/details_harness|winogrande|5_2024-01-05T12-10-45.462405.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-05T12-10-45.462405.parquet'
- config_name: results
data_files:
- split: 2024_01_05T12_10_45.462405
path:
- results_2024-01-05T12-10-45.462405.parquet
- split: latest
path:
- results_2024-01-05T12-10-45.462405.parquet
---
# Dataset Card for Evaluation run of princeton-nlp/Sheared-LLaMA-2.7B-ShareGPT
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [princeton-nlp/Sheared-LLaMA-2.7B-ShareGPT](https://huggingface.co/princeton-nlp/Sheared-LLaMA-2.7B-ShareGPT) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-2.7B-ShareGPT",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-05T12:10:45.462405](https://huggingface.co/datasets/open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-2.7B-ShareGPT/blob/main/results_2024-01-05T12-10-45.462405.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2919655714473898,
"acc_stderr": 0.0318639028810806,
"acc_norm": 0.2944023668702236,
"acc_norm_stderr": 0.032711391877200874,
"mc1": 0.3011015911872705,
"mc1_stderr": 0.01605899902610062,
"mc2": 0.4771392382771529,
"mc2_stderr": 0.015567072294317703
},
"harness|arc:challenge|25": {
"acc": 0.3779863481228669,
"acc_stderr": 0.014169664520303101,
"acc_norm": 0.4104095563139932,
"acc_norm_stderr": 0.014374922192642666
},
"harness|hellaswag|10": {
"acc": 0.5435172276438957,
"acc_stderr": 0.00497084669755231,
"acc_norm": 0.7126070503883688,
"acc_norm_stderr": 0.004516215206715344
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.32894736842105265,
"acc_stderr": 0.03823428969926604,
"acc_norm": 0.32894736842105265,
"acc_norm_stderr": 0.03823428969926604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.33584905660377357,
"acc_stderr": 0.02906722014664483,
"acc_norm": 0.33584905660377357,
"acc_norm_stderr": 0.02906722014664483
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.03708284662416545,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.03708284662416545
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2297872340425532,
"acc_stderr": 0.027501752944412417,
"acc_norm": 0.2297872340425532,
"acc_norm_stderr": 0.027501752944412417
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489362,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489362
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2827586206896552,
"acc_stderr": 0.03752833958003336,
"acc_norm": 0.2827586206896552,
"acc_norm_stderr": 0.03752833958003336
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.022569897074918424,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.022569897074918424
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.037184890068181146,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.037184890068181146
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2967741935483871,
"acc_stderr": 0.0259885007924119,
"acc_norm": 0.2967741935483871,
"acc_norm_stderr": 0.0259885007924119
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2315270935960591,
"acc_stderr": 0.029678333141444444,
"acc_norm": 0.2315270935960591,
"acc_norm_stderr": 0.029678333141444444
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3393939393939394,
"acc_stderr": 0.03697442205031595,
"acc_norm": 0.3393939393939394,
"acc_norm_stderr": 0.03697442205031595
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.25252525252525254,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.25252525252525254,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.37305699481865284,
"acc_stderr": 0.03490205592048574,
"acc_norm": 0.37305699481865284,
"acc_norm_stderr": 0.03490205592048574
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.29743589743589743,
"acc_stderr": 0.02317740813146593,
"acc_norm": 0.29743589743589743,
"acc_norm_stderr": 0.02317740813146593
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.02696242432507383,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.02696242432507383
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2689075630252101,
"acc_stderr": 0.028801392193631276,
"acc_norm": 0.2689075630252101,
"acc_norm_stderr": 0.028801392193631276
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24770642201834864,
"acc_stderr": 0.018508143602547808,
"acc_norm": 0.24770642201834864,
"acc_norm_stderr": 0.018508143602547808
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.30092592592592593,
"acc_stderr": 0.031280390843298825,
"acc_norm": 0.30092592592592593,
"acc_norm_stderr": 0.031280390843298825
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.032834720561085676,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.032834720561085676
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.32489451476793246,
"acc_stderr": 0.030486039389105303,
"acc_norm": 0.32489451476793246,
"acc_norm_stderr": 0.030486039389105303
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3094170403587444,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.3094170403587444,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2748091603053435,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.2748091603053435,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.39669421487603307,
"acc_stderr": 0.044658697805310094,
"acc_norm": 0.39669421487603307,
"acc_norm_stderr": 0.044658697805310094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04330043749650743,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04330043749650743
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3312883435582822,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.3312883435582822,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467762,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467762
},
"harness|hendrycksTest-management|5": {
"acc": 0.20388349514563106,
"acc_stderr": 0.03989139859531773,
"acc_norm": 0.20388349514563106,
"acc_norm_stderr": 0.03989139859531773
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.29914529914529914,
"acc_stderr": 0.029996951858349472,
"acc_norm": 0.29914529914529914,
"acc_norm_stderr": 0.029996951858349472
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.35887611749680715,
"acc_stderr": 0.017152991797501342,
"acc_norm": 0.35887611749680715,
"acc_norm_stderr": 0.017152991797501342
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3236994219653179,
"acc_stderr": 0.02519018132760841,
"acc_norm": 0.3236994219653179,
"acc_norm_stderr": 0.02519018132760841
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24134078212290502,
"acc_stderr": 0.014310999547961441,
"acc_norm": 0.24134078212290502,
"acc_norm_stderr": 0.014310999547961441
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.31699346405228757,
"acc_stderr": 0.02664327847450875,
"acc_norm": 0.31699346405228757,
"acc_norm_stderr": 0.02664327847450875
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3279742765273312,
"acc_stderr": 0.0266644108869376,
"acc_norm": 0.3279742765273312,
"acc_norm_stderr": 0.0266644108869376
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.32098765432098764,
"acc_stderr": 0.02597656601086274,
"acc_norm": 0.32098765432098764,
"acc_norm_stderr": 0.02597656601086274
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25177304964539005,
"acc_stderr": 0.025892151156709405,
"acc_norm": 0.25177304964539005,
"acc_norm_stderr": 0.025892151156709405
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2796610169491525,
"acc_stderr": 0.011463397393861973,
"acc_norm": 0.2796610169491525,
"acc_norm_stderr": 0.011463397393861973
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3161764705882353,
"acc_stderr": 0.028245687391462916,
"acc_norm": 0.3161764705882353,
"acc_norm_stderr": 0.028245687391462916
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2973856209150327,
"acc_stderr": 0.018492596536396955,
"acc_norm": 0.2973856209150327,
"acc_norm_stderr": 0.018492596536396955
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3551020408163265,
"acc_stderr": 0.03063565515038764,
"acc_norm": 0.3551020408163265,
"acc_norm_stderr": 0.03063565515038764
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2835820895522388,
"acc_stderr": 0.031871875379197986,
"acc_norm": 0.2835820895522388,
"acc_norm_stderr": 0.031871875379197986
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2891566265060241,
"acc_stderr": 0.03529486801511115,
"acc_norm": 0.2891566265060241,
"acc_norm_stderr": 0.03529486801511115
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3011015911872705,
"mc1_stderr": 0.01605899902610062,
"mc2": 0.4771392382771529,
"mc2_stderr": 0.015567072294317703
},
"harness|winogrande|5": {
"acc": 0.6416732438831886,
"acc_stderr": 0.013476581172567528
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
OBF/tokenizer-dataset | ---
dataset_info:
features:
- name: content
dtype: string
splits:
- name: train
num_bytes: 9600641567.757967
num_examples: 2000000
download_size: 3878879579
dataset_size: 9600641567.757967
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_jpechg__Sour-Marcoro-12.5B | ---
pretty_name: Evaluation run of jpechg/Sour-Marcoro-12.5B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jpechg/Sour-Marcoro-12.5B](https://huggingface.co/jpechg/Sour-Marcoro-12.5B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jpechg__Sour-Marcoro-12.5B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-02T01:13:13.191577](https://huggingface.co/datasets/open-llm-leaderboard/details_jpechg__Sour-Marcoro-12.5B/blob/main/results_2024-02-02T01-13-13.191577.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6581308018783164,\n\
\ \"acc_stderr\": 0.03193252324169342,\n \"acc_norm\": 0.6618395212033175,\n\
\ \"acc_norm_stderr\": 0.03257776906432054,\n \"mc1\": 0.5397796817625459,\n\
\ \"mc1_stderr\": 0.017448017223960877,\n \"mc2\": 0.6816993058639923,\n\
\ \"mc2_stderr\": 0.015465736469164977\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.659556313993174,\n \"acc_stderr\": 0.013847460518892978,\n\
\ \"acc_norm\": 0.6791808873720137,\n \"acc_norm_stderr\": 0.013640943091946526\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6563433578968333,\n\
\ \"acc_stderr\": 0.004739575380508865,\n \"acc_norm\": 0.8369846644094802,\n\
\ \"acc_norm_stderr\": 0.0036862475593618374\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.756578947368421,\n \"acc_stderr\": 0.034923496688842384,\n\
\ \"acc_norm\": 0.756578947368421,\n \"acc_norm_stderr\": 0.034923496688842384\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04605661864718381,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04605661864718381\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.03567603799639171,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.03567603799639171\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383887,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383887\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6595744680851063,\n \"acc_stderr\": 0.030976692998534443,\n\
\ \"acc_norm\": 0.6595744680851063,\n \"acc_norm_stderr\": 0.030976692998534443\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6413793103448275,\n \"acc_stderr\": 0.039966295748767186,\n\
\ \"acc_norm\": 0.6413793103448275,\n \"acc_norm_stderr\": 0.039966295748767186\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.46825396825396826,\n \"acc_stderr\": 0.0256993528321318,\n \"\
acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.0256993528321318\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n\
\ \"acc_stderr\": 0.022891687984554952,\n \"acc_norm\": 0.7967741935483871,\n\
\ \"acc_norm_stderr\": 0.022891687984554952\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\
: 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8383838383838383,\n \"acc_stderr\": 0.026225919863629283,\n \"\
acc_norm\": 0.8383838383838383,\n \"acc_norm_stderr\": 0.026225919863629283\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644244,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644244\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6923076923076923,\n \"acc_stderr\": 0.0234009289183105,\n \
\ \"acc_norm\": 0.6923076923076923,\n \"acc_norm_stderr\": 0.0234009289183105\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066482,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066482\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7226890756302521,\n \"acc_stderr\": 0.02907937453948001,\n \
\ \"acc_norm\": 0.7226890756302521,\n \"acc_norm_stderr\": 0.02907937453948001\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3973509933774834,\n \"acc_stderr\": 0.03995524007681681,\n \"\
acc_norm\": 0.3973509933774834,\n \"acc_norm_stderr\": 0.03995524007681681\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8330275229357799,\n \"acc_stderr\": 0.01599015488507334,\n \"\
acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.01599015488507334\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5509259259259259,\n \"acc_stderr\": 0.033922384053216174,\n \"\
acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.033922384053216174\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8523206751054853,\n \"acc_stderr\": 0.0230943295825957,\n \
\ \"acc_norm\": 0.8523206751054853,\n \"acc_norm_stderr\": 0.0230943295825957\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596915,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596915\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709696,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709696\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n\
\ \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573973,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573973\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n\
\ \"acc_stderr\": 0.023902325549560406,\n \"acc_norm\": 0.8418803418803419,\n\
\ \"acc_norm_stderr\": 0.023902325549560406\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8071519795657727,\n\
\ \"acc_stderr\": 0.014108533515757431,\n \"acc_norm\": 0.8071519795657727,\n\
\ \"acc_norm_stderr\": 0.014108533515757431\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.023786203255508287,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.023786203255508287\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41899441340782123,\n\
\ \"acc_stderr\": 0.016501579306861677,\n \"acc_norm\": 0.41899441340782123,\n\
\ \"acc_norm_stderr\": 0.016501579306861677\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137894,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137894\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.025583062489984824,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.025583062489984824\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7932098765432098,\n \"acc_stderr\": 0.022535006705942842,\n\
\ \"acc_norm\": 0.7932098765432098,\n \"acc_norm_stderr\": 0.022535006705942842\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5177304964539007,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.5177304964539007,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4810951760104302,\n\
\ \"acc_stderr\": 0.012761104871472655,\n \"acc_norm\": 0.4810951760104302,\n\
\ \"acc_norm_stderr\": 0.012761104871472655\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7242647058823529,\n \"acc_stderr\": 0.027146271936625162,\n\
\ \"acc_norm\": 0.7242647058823529,\n \"acc_norm_stderr\": 0.027146271936625162\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6650326797385621,\n \"acc_stderr\": 0.01909422816700031,\n \
\ \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.01909422816700031\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.02916273841024977,\n\
\ \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.02916273841024977\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.032659863237109066,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.032659863237109066\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n\
\ \"acc_stderr\": 0.038444531817709175,\n \"acc_norm\": 0.5783132530120482,\n\
\ \"acc_norm_stderr\": 0.038444531817709175\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.031581495393387324,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.031581495393387324\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5397796817625459,\n\
\ \"mc1_stderr\": 0.017448017223960877,\n \"mc2\": 0.6816993058639923,\n\
\ \"mc2_stderr\": 0.015465736469164977\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8208366219415943,\n \"acc_stderr\": 0.010777949156047989\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.47687642153146326,\n \
\ \"acc_stderr\": 0.013757748544245326\n }\n}\n```"
repo_url: https://huggingface.co/jpechg/Sour-Marcoro-12.5B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|arc:challenge|25_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|gsm8k|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hellaswag|10_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T01-13-13.191577.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T01-13-13.191577.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- '**/details_harness|winogrande|5_2024-02-02T01-13-13.191577.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T01-13-13.191577.parquet'
- config_name: results
data_files:
- split: 2024_02_02T01_13_13.191577
path:
- results_2024-02-02T01-13-13.191577.parquet
- split: latest
path:
- results_2024-02-02T01-13-13.191577.parquet
---
# Dataset Card for Evaluation run of jpechg/Sour-Marcoro-12.5B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jpechg/Sour-Marcoro-12.5B](https://huggingface.co/jpechg/Sour-Marcoro-12.5B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jpechg__Sour-Marcoro-12.5B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-02T01:13:13.191577](https://huggingface.co/datasets/open-llm-leaderboard/details_jpechg__Sour-Marcoro-12.5B/blob/main/results_2024-02-02T01-13-13.191577.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6581308018783164,
"acc_stderr": 0.03193252324169342,
"acc_norm": 0.6618395212033175,
"acc_norm_stderr": 0.03257776906432054,
"mc1": 0.5397796817625459,
"mc1_stderr": 0.017448017223960877,
"mc2": 0.6816993058639923,
"mc2_stderr": 0.015465736469164977
},
"harness|arc:challenge|25": {
"acc": 0.659556313993174,
"acc_stderr": 0.013847460518892978,
"acc_norm": 0.6791808873720137,
"acc_norm_stderr": 0.013640943091946526
},
"harness|hellaswag|10": {
"acc": 0.6563433578968333,
"acc_stderr": 0.004739575380508865,
"acc_norm": 0.8369846644094802,
"acc_norm_stderr": 0.0036862475593618374
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.756578947368421,
"acc_stderr": 0.034923496688842384,
"acc_norm": 0.756578947368421,
"acc_norm_stderr": 0.034923496688842384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.7,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.03567603799639171,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.03567603799639171
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383887,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383887
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6595744680851063,
"acc_stderr": 0.030976692998534443,
"acc_norm": 0.6595744680851063,
"acc_norm_stderr": 0.030976692998534443
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6413793103448275,
"acc_stderr": 0.039966295748767186,
"acc_norm": 0.6413793103448275,
"acc_norm_stderr": 0.039966295748767186
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.0256993528321318,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.0256993528321318
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.022891687984554952,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.022891687984554952
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8383838383838383,
"acc_stderr": 0.026225919863629283,
"acc_norm": 0.8383838383838383,
"acc_norm_stderr": 0.026225919863629283
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644244,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644244
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6923076923076923,
"acc_stderr": 0.0234009289183105,
"acc_norm": 0.6923076923076923,
"acc_norm_stderr": 0.0234009289183105
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066482,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066482
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7226890756302521,
"acc_stderr": 0.02907937453948001,
"acc_norm": 0.7226890756302521,
"acc_norm_stderr": 0.02907937453948001
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3973509933774834,
"acc_stderr": 0.03995524007681681,
"acc_norm": 0.3973509933774834,
"acc_norm_stderr": 0.03995524007681681
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.01599015488507334,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.01599015488507334
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.033922384053216174,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.033922384053216174
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8523206751054853,
"acc_stderr": 0.0230943295825957,
"acc_norm": 0.8523206751054853,
"acc_norm_stderr": 0.0230943295825957
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.031602951437766785,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.031602951437766785
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596915,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596915
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709696,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709696
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573973,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573973
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.023902325549560406,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.023902325549560406
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8071519795657727,
"acc_stderr": 0.014108533515757431,
"acc_norm": 0.8071519795657727,
"acc_norm_stderr": 0.014108533515757431
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.023786203255508287,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.023786203255508287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41899441340782123,
"acc_stderr": 0.016501579306861677,
"acc_norm": 0.41899441340782123,
"acc_norm_stderr": 0.016501579306861677
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137894,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137894
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984824,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984824
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7932098765432098,
"acc_stderr": 0.022535006705942842,
"acc_norm": 0.7932098765432098,
"acc_norm_stderr": 0.022535006705942842
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5177304964539007,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.5177304964539007,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4810951760104302,
"acc_stderr": 0.012761104871472655,
"acc_norm": 0.4810951760104302,
"acc_norm_stderr": 0.012761104871472655
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7242647058823529,
"acc_stderr": 0.027146271936625162,
"acc_norm": 0.7242647058823529,
"acc_norm_stderr": 0.027146271936625162
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.01909422816700031,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.01909422816700031
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.02916273841024977,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.02916273841024977
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.032659863237109066,
"acc_norm": 0.88,
"acc_norm_stderr": 0.032659863237109066
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5783132530120482,
"acc_stderr": 0.038444531817709175,
"acc_norm": 0.5783132530120482,
"acc_norm_stderr": 0.038444531817709175
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.031581495393387324,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.031581495393387324
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5397796817625459,
"mc1_stderr": 0.017448017223960877,
"mc2": 0.6816993058639923,
"mc2_stderr": 0.015465736469164977
},
"harness|winogrande|5": {
"acc": 0.8208366219415943,
"acc_stderr": 0.010777949156047989
},
"harness|gsm8k|5": {
"acc": 0.47687642153146326,
"acc_stderr": 0.013757748544245326
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
presencesw/dataset2_translated_not_cleaned | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: references
sequence: string
- name: question_vi
dtype: string
- name: answer_vi
dtype: string
- name: references_vi
sequence: string
splits:
- name: train
num_bytes: 6160703.590222222
num_examples: 1011
download_size: 3118646
dataset_size: 6160703.590222222
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ibranze/araproje_hellaswag_en_conf_gpt_worstscore_reversed | ---
dataset_info:
features:
- name: ind
dtype: int32
- name: activity_label
dtype: string
- name: ctx_a
dtype: string
- name: ctx_b
dtype: string
- name: ctx
dtype: string
- name: endings
sequence: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: split_type
dtype: string
- name: label
dtype: string
splits:
- name: validation
num_bytes: 149738.0
num_examples: 250
download_size: 81166
dataset_size: 149738.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_hellaswag_en_conf_gpt_worstscore_reversed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zolak/twitter_dataset_81_1713211320 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 1510081
num_examples: 3790
download_size: 763956
dataset_size: 1510081
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
reciprocate/tinygsm_mixtral_1M_dedup_full | ---
dataset_info:
features:
- name: question
dtype: string
- name: program
dtype: string
- name: result
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 1246519271
num_examples: 925186
download_size: 364433663
dataset_size: 1246519271
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
irds/aquaint | ---
pretty_name: '`aquaint`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `aquaint`
The `aquaint` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/aquaint#aquaint).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=1,033,461
This dataset is used by: [`aquaint_trec-robust-2005`](https://huggingface.co/datasets/irds/aquaint_trec-robust-2005)
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/aquaint', 'docs')
for record in docs:
record # {'doc_id': ..., 'text': ..., 'marked_up_doc': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@misc{Graff2002Aquaint,
title={The AQUAINT Corpus of English News Text},
author={David Graff},
year={2002},
url={https://catalog.ldc.upenn.edu/LDC2002T31},
publisher={Linguistic Data Consortium}
}
```
|
autoevaluate/autoeval-eval-phpthinh__data_1-default-4c0514-1832562967 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- phpthinh/data_1
eval_info:
task: text_zero_shot_classification
model: bigscience/bloom-560m
metrics: []
dataset_name: phpthinh/data_1
dataset_config: default
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: bigscience/bloom-560m
* Dataset: phpthinh/data_1
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@phpthinh](https://huggingface.co/phpthinh) for evaluating this model. |
ademax/extract_metadata_extra_1 | ---
dataset_info:
features:
- name: input_text
dtype: string
- name: output_text
dtype: string
- name: type
dtype: string
splits:
- name: train
num_bytes: 137969246.9339555
num_examples: 49301
download_size: 78223750
dataset_size: 137969246.9339555
---
# Dataset Card for "extract_metadata_extra_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
freshpearYoon/v3_train_free_concat_25 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 3842555432
num_examples: 2500
download_size: 1781879474
dataset_size: 3842555432
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.